Is there a role of mathematics in the development of human consciousness?

In addition to Hofstadter's wonderful writings, you might also be interested in work done on the relationships between mathematics and cognition (more generally than just consciousness). Take a look at these classics in that area:

Rochel Gelman & C.R. Gallistel, The Child's Understanding of Number (Harvard University Press, 1986)

George Lakoff & Rafael Núñez, Where Mathematics Comes From: How the Embodied Mind Brings Mathematics into Being (Basic Books, 2001)

Stanislas Dehaene, The Number Sense: How the Mind Creates Mathematics (Oxford University Press, 2011)

Interesting question -- not sure I understand it exactly, but I can refer you to some fascinating work that touches on it: mostly anything by Douglas Hofstadter, but you might start with I Am a Strange Loop and/or Gödel, Escher, Bach ... both spectacular works that trace the essence of consciousness to self-referential recursive (mathematical) processes ... he'd be a good place to start. Best, Andrew Pessin

One of the obvious ways computers are limited is in their representation of numbers. Since computers represent numbers as bit strings of finite length, they can only represent finitely many, and to a finite degree of precision. Is it a mistake to think that humans, unlike computers, can represent infinitely many numbers with arbitrary precision? We obviously talk about things like the set of all real numbers; and we make use of symbols, like the letter pi, which purport to represent certain irrational numbers exactly. But then I'm not sure whether things like this really do show that we can represent numbers in a way that is fundamentally beyond computers.
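To make the questioner's premise concrete: a standard 64-bit float has only finitely many bit patterns, so only finitely many representable values. A minimal sketch in Python (the behavior shown is standard IEEE 754 double precision, not anything claimed in the exchange itself):

```python
import math
import sys

# A Python float is a 64-bit IEEE 754 double: at most 2**64 bit patterns,
# hence only finitely many representable numbers, each of finite precision.
print(sys.float_info.dig)    # 15 -- decimal digits that are always preserved
print(0.1 + 0.2 == 0.3)      # False: none of these decimals is exactly representable in binary
print(math.pi)               # 3.141592653589793 -- a finite approximation, not pi itself
```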

This one is basically above my pay grade, but I'll take a stab. I share your doubt that humans "can represent infinitely many numbers with arbitrary precision" in any way beyond what we find with computers. After all, our own hardware (our brain) is finite in the same ways/senses as computers are, so if sheer finitude establishes the limits of representation, it's hard to see why we would differ from computers.

If, on the other hand, you're imagining this as an argument for dualism -- i.e., that our minds are distinct from our brains because they have infinite capacity in a way that our brains don't -- then you would first have to prove the infinite capacity of our minds. Simply writing or thinking "pi" isn't enough; the fact that "pi" represents something infinitely expandable doesn't make the symbol "pi" infinite. The clearest proof would be if we could grasp (say) the complete infinite expansion of pi in one mental glance -- but we can't. At best we can grasp THAT the expansion goes on to infinity, just as we can grasp THAT the natural numbers go on to infinity. That's at least one important sense in which we have a concept of the infinite, in which our minds represent the infinite -- and while philosophers such as Descartes and Malebranche might invoke that in their arguments for dualism, it doesn't strike me as very convincing. Realizing "I can always add 1" just doesn't strike me as a thought that is interestingly infinite in content; it merely refers to the infinite without fully capturing it.

And as Aristotle suggested, we must distinguish the "potentially infinite" from the "actually infinite" -- when we grasp that some sequence "goes on to infinity," we are grasping that, were we to survive to infinity, we would never complete (computing) the sequence -- but that is something less than saying the infinite actually exists. As far as I know, computers are just as able to represent the infinite as we are (in this sense), and this sense falls short of supporting dualism. Putting all that together, I don't think we've got a case for distinguishing the capacities/nature of minds vs. computers this way.
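One way to make this point concrete: a computer can hold a finite program that generates the digits of pi on demand, without ever containing the completed expansion -- arguably the computational analogue of grasping THAT the expansion goes on to infinity. A sketch, assuming Gibbons' unbounded spigot algorithm (my choice of illustration, not anything cited in the exchange):

```python
from itertools import islice

def pi_digits():
    # Gibbons' unbounded spigot: streams decimal digits of pi one at a time.
    # The function is a finite object, yet it denotes the whole expansion in
    # the "potential" sense: any given digit is eventually produced, but no
    # completed infinite total ever exists in memory.
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            yield n  # this digit is now settled; emit it
            q, r, n = 10 * q, 10 * (r - n * t), (10 * (3 * q + r)) // t - 10 * n
        else:
            q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                (q * (7 * k + 2) + r * l) // (t * l), l + 2)

print(list(islice(pi_digits(), 10)))  # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
```

In Aristotle's terms, the generator is a potential infinity: you can always ask for one more digit, but at no point is the actual infinite expansion present.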

Having an almost three-year-old daughter leads me into deep philosophical questions about mathematics. :-) Really, I am concerned about the concept of "being able to count". People ask me if my daughter can count, and I can't avoid giving long answers people were not expecting. Firstly, my daughter is very good at "how many" questions when the things to count are one, two or three, and sometimes gives that kind of information without being asked. But she doesn't really count them; she just "sees" that there are three, two or one of these things and says so. Once in a while she does the same in relation to four things, but that's rare. Secondly, she can reproduce the series of the names of numbers from 1 to 12. (Then she jumps to the word for "fourteen" in our language, and that's it.) But I don't think she can count to 12. Thirdly, she is usually very exact in counting to four, five or six, but she makes some surprising mistakes. Yesterday, she was counting the legs of a (plastic) donkey (in natural...

Most of these questions are not so much philosophical as empirical, and there has been a tremendous amount of extremely important work done in the last few decades on children's concepts of number. The locus classicus is The Child's Understanding of Number, by Rochel Gelman and Randy Gallistel, which was originally published in 1978, but this stuff really took off in the late 1990s or so. A lot of people have contributed to this work, but I'll mention two: Susan Carey and Liz Spelke, who are both at Harvard. You will find links to some of their work on their websites. Part of the reason people got interested in these issues is that they are closely related to issues about object recognition and individuation, which had been the focus of a great deal of work just before that. (That is, people had been interested in the question of the age at which children start to "pick out" objects from the environment and to think of them as distinct entities that continue to exist even when you do not see them. The answers turn out to be every bit as fascinating as one might hope they would be.)

It turns out that there are several different cognitive systems at work in numerical cognition. One of them is a system that works by "pattern recognition" -- what psychologists call "subitizing". This is the system your child is using when she just "sees" how many things there are. She's not counting, even to herself, but just recognizing a pattern. Unsurprisingly, this system does not work for very large numbers. If I remember right, it tends to give out around four or five for most people---and for many animals, too, who share this particular system with us. There's another system that is based on what people call an "analog accumulator" and that, again, we share with many other animals.

Counting, on the other hand, is something that seems to be found, in nature at least, only in humans, though I know there are parrots who have been taught to count. Before we talk about counting, though, we need to distinguish two kinds of counting, which are known as "intransitive" and "transitive". (I think the terms originated with Charles Parsons, but I'm not sure.) Intransitive counting is just rehearsing the number sequence, i.e., saying, "1, 2, 3, 4", etc. Transitive counting is using the number sequence to count some objects. Obviously, you have to learn the former before you can learn the latter, and there is almost always a developmental stage where children are good enough at intransitive counting, but quite bad at transitive counting, or even unwilling to do it entirely.
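The distinction can be put operationally. A toy sketch, with hypothetical function names of my own (a model of the distinction itself, not a claim about child cognition):

```python
def count_intransitively(upto):
    # Intransitive counting: just rehearsing the number sequence.
    return list(range(1, upto + 1))

def count_transitively(items, tags=None):
    # Transitive counting: tag each object with exactly one counting word,
    # once and only once; the last tag used answers "how many?".
    tags = tags or count_intransitively(len(items))
    tagged = list(zip(tags, items))  # the one-to-one tagging
    return tagged[-1][0] if tagged else 0

print(count_transitively(["plate"] * 5))  # 5
# As noted below, any stable ordered sequence can serve as tags:
print(count_transitively(["doll"] * 3, tags=["a", "b", "c"]))  # c
```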

We also need to distinguish the question, "Can so-and-so count transitively?" from the questions of (a) how well they do it and (b) whether they understand what they are doing in the way we do. Concerning (a), the distinction we need here is Chomsky's distinction between "competence" and "performance". It sounds, from your description, as if your child knows that each item is to be counted (that is, "tagged" with a counting word) once and only once. But knowing this is one thing, and being able to tag each thing just once is another. Even we adults make this kind of mistake sometimes, and even a child who almost always makes such a mistake might know what she is supposed to be doing. So in that sense, she might be able to count, but not exhibit this ability in her performance.

Concerning (b), we adults use counting to find out how many things of a certain sort there are, a fact we then use to make other kinds of decisions. If we count five plates and then go to get forks, we also count out five forks, for the obvious reason that this is one way to make sure we have the same number of forks as plates. But there is, again, almost always a developmental stage at which children can count, but they do not understand the significance of the exercise. So if you ask them to count the plates and then go get five forks, they have no idea what to do. Indeed, at this sort of stage, children almost seem to understand the question, "How many plates are there?" as meaning: Would you please count the plates? You can ask them over and over, and they'll count each time; they won't just think, well, I just counted, so there are five, and why are you asking me again? Nor do they understand, e.g., that, if there are five dolls and five hats, then this means that you have a hat for each doll, but no more, or, conversely, that if you have five dolls and a hat for each doll, but no more, then you have five hats.

Amusingly, children at this stage will very often use other "count sequences" instead of number words. They'll count "a, b, c" or "Monday, Tuesday, Wednesday", and be perfectly fine with that. The moral of the story, then, is that the ability to count does not, by itself, imply having an adult understanding of "how many" questions and their answers.

So, does your child know how to count? In some ways, yes, and in some ways, perhaps no.

Rather than answer I will merely invoke a classic Sesame Street episode. Grover is counting oranges: one, two, three, etc. And again: one, two, three. Then someone else comes in with a basket of apples and asks him to count these as well. But he breaks into tears. Alas, he can count oranges, but he has never learned to count apples. ap

I've read in several places that scientists have estimated the number of atoms in our galaxy to be (very) roughly 10 to the 65th power. This is an extraordinarily huge and basically incomprehensible number. However, this figure is more than 100 times smaller than the number of ways I could arrange the ordinary deck of playing cards I have in my hands. [52 factorial is approximately 8 x 10 to the 67th power]. Pardon the exaggeration, but how can I keep facts like this from melting my brain?
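The questioner's arithmetic checks out; a quick confirmation in Python (a sketch using only the standard library):

```python
import math

atoms_in_galaxy = 10 ** 65            # the rough estimate cited in the question
deck_orderings = math.factorial(52)   # exact value of 52!

print(f"{deck_orderings:.2e}")              # 8.07e+67
print(deck_orderings // atoms_in_galaxy)    # 806 -- indeed "more than 100 times" larger
```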

Apparently you have, if you wrote this question! :-)

(People also like to talk about the immense number of neural connections within our brains -- I don't know how that number compares to the ones you mentioned, but I believe it's pretty brain-melting too!)

ap

I have been reading discussions on this site about the Principia and about Gödel's incompleteness theorem. I would really like to understand what you guys are talking about; it seems endlessly fascinating. I was an English/history major, though, and avoided math whenever I could. Consequently I have never even taken a semester of calculus. The good news (from my perspective) is that I have nothing to do for the rest of my life except for working toward the fulfillment of this one goal I have: to plow through the literature of the Frankfurt School and make sense of it all. Understanding the methods and arguments of logicians would seem to provide a strong context for the worldview that inspired Horkheimer, Fromm, et al. So yeah, where should I start? Do I need to get a book on the fundamentals of arithmetic? Algebra? Geometry? Or do books on elementary logic do a good job explaining the mathematics necessary for understanding the material? As I said, I'm not looking for a quick solution. I...

1. I don't think there is any reason to suppose that learning about mathematical logic from Principia to Gödel will be any help at all in understanding what is going on with the Frankfurt School. (The only tenuous connection I can think of is that the logical positivists were influenced by developments in logic, and the Frankfurt School were concerned inter alia to give a critique of positivism. But since neither the authors of Principia nor Gödel were positivists, it would be better to read some of the positivists themselves if you want to know what the Frankfurt School were reacting against).

2. Of course, I think finding out a bit about mathematical logic is fun for its own sake: but it is mathematics, and to really understand it I'm afraid there is not much for it other than working through some increasingly tough books with titles like "An Introduction to Logic", followed by "Intermediate Logic", and then "Mathematical Logic". Still, you can get a distant impression of what's going on by following links on Wikipedia etc. And on Gödelian matters, Hofstadter's long book is entertainingly illuminating and somewhat annoying in about equal measure. Goldstein's book, though, is hopeless as a guide: see http://math.stanford.edu/~feferman/papers/lrb.pdf for an authoritative demolition (which indeed pulls its punches). If I were going to recommend one book on Gödel as a way in for the non-mathematical, it would be Torkel Franzén's Gödel's Theorem: An Incomplete Guide to Its Use and Abuse.

Lucky you, with so much time on your hands and with such interesting interests! There are numerous secondary expositions of Gödel etc. -- I personally love Douglas Hofstadter's way of explaining it (in Gödel, Escher, Bach and also his more recent I Am a Strange Loop) ... but Rebecca Goldstein has a recent book on it (haven't read it, can't speak to its quality) -- http://www.rebeccagoldstein.com/books/incompleteness/index.html Good luck, ap