Patterns in language yield patterns in thought. Extensive research has demonstrated that differences between languages can yield differences, often subtle ones, in the cognitive habits of their speakers. This finding, commonly referred to as linguistic relativity, has now been supported by dozens of studies on topics like spatial awareness, the perception of time, and the categorisation of colours. For instance, “where” the future and past “are” depends on the language you speak. Similarly, the manner in which you recall and discriminate colours is affected in subtle ways by the basic colour term inventory of your native language. Our tour of the numberless worlds ultimately led to the conclusion that numeric language also yields differences in how people think. Number words, present in the vast majority of the world’s languages (though not all of them), certainly influence quantitative cognition. Only those people who are familiar with number words and counting can exactly differentiate most quantities. The presence of numbers in a language does not just subtly influence how we think about certain quantities, then; it also opens a door to the world of arithmetic and mathematics. The first step through that door is the realisation that quantities, regardless of size, can be precisely differentiated. But how exactly do numbers first open this door? And what happens after we walk through it?
The findings from numberless worlds suggest plainly that we need numbers to really “get” quantities in ways that are uniquely human, but this raises a paradox. If we need numbers to appreciate most quantities precisely, how did we get numbers in the first place? How could we ever name the amounts in particular sets of items if we could not recognise those amounts?
Given the apparent intractability of this paradox, some have concluded that humans must be innately predisposed to acquire number concepts. But if we are predisposed to recognise different set sizes as separate abstract entities, then what is the limit to this predisposition? Are we naturally predisposed, for example, to eventually realise that 1,023 is not 1,024? This seems fairly implausible. Framed differently, nativist views on numbers just delay the point at which we reach the paradox.
James Hurford noted that number words are names for the “non-linguistic entities denoted by numbers.” That is, the number words label conceptual entities. In a related vein, Karenleigh Overmann recently suggested that “quantity concepts must surely precede their lexical labels, or there would be nothing to name… A method of invention cannot presuppose that which it invents.” This latter stance is understandable, but it arguably trivialises the extensive evidence according to which words for quantities beyond three do not simply label pre-existing concepts, because these concepts do not exist for most people until they actually learn numbers.
In my view, this is the key to resolving the paradox: words for quantities beyond three make concrete the precise numerical abstractions that are only occasionally and inconsistently made by some people. Some of these people may eventually invent numbers, but if they do not, their fleeting abstractions are not transferred to others. The naming of such ephemeral realisations is what eventually enables people to consistently make a simple but powerful realisation: that sets of quantities greater than three can be identified precisely. This simple realisation has led, in all likelihood more times than could be documented, to the invention of symbols for such larger quantities. These symbols are chiefly verbal in nature, judging from the fact that the overwhelming majority of the world’s cultures have words for such quantities, though most cultures traditionally lack written numerals or elaborate tally systems. Some people invented number words to concretise the potentially transient recognition of the existence of exact higher quantities.
Does this mean that number words simply serve as labels for the concepts? Not really. The truth seems a bit more nuanced than the forced dichotomous choice assumed by the paradox. Number words are not simply labels, yet they do describe conceptual realisations that some people make sometimes. The term ‘label’ implies that the words simply denote concepts that we all think about: concepts all humans are born ready to appreciate (at least eventually), regardless of their cultural environment. But clearly not all humans have such concepts at the ready even as adults, and likely most people would never make the relevant realisations that can be described via numbers. Just as clearly, though, some people have made those realisations, even if inconsistently. In those real historical cases in which people managed to describe that realisation with a word, they invented numbers. The concept they named was subsequently recognised by other members of their culture through the adoption of the relevant word(s). Number words are conceptual tools that get passed around with ease, tools most people want to borrow.
Caleb Everett, Numbers and the Making of Us: Counting and the Course of Human Cultures
Nature is that which has always been there. This is the thinking of Heraclitus. In his eyes, it has always been made up of the world (cosmos) as what “was, is and will be.” But this is to make Nature finite, to diminish its power. Nature did not create itself, that is to say permanently structure itself into the world, but unceasingly and tirelessly builds itself and becomes finite by forming itself into a multiplicity of worlds. This means that it breaks up into innumerable worlds that are not at all eternal, but are born and perish. It is like a perpetual laboratory of endless and multiple trials, because it is not only one order (cosmos) that is born of Nature, but all systems of order are born of it at one time or another.
By his cosmology, Heraclitus is the ancestor of Plato’s followers. However, by his panta rhei, “everything flows,” he is the prime example of all the philosophies of movement, from Montaigne to Bergson, before and after. Furthermore, what is the Tao, according to Lao Tzu, but “perpetual mutability itself,” that is to say Heraclitus’s river? Yet it must be added: with certain characteristics of Anaximander’s Phusis, because the “Path” (Tao), which is infinite in that it is unqualified, undetermined, and conceptually incomprehensible, is also the source and principle of birth and growth for individual beings: differentiating themselves and becoming finite, it thus deploys a generative force, Te – a word which is generally translated as “Virtue.” Nothing prevents this “Virtue” from showing itself in innumerable worlds.
Steven Pinker is, of course, both clever and influential, and there is much that I would agree with him about. So when he makes what he calls an impassioned plea for an understanding between science and the humanities, something that I feel strongly about, too, and indeed believe to be of the greatest importance for our future, it seems churlish to find fault, especially as I am grateful to him for the opportunity to explore in more detail issues about which it is obvious we both care very much. But for all that he claims to be setting out to reassure his colleagues in the humanities, I doubt that his essay will have the desired effect. In fact I fear that it may appear to some to exemplify everything that those in the humanities fear to be the case about the contemporary science establishment.
The marriage, or at any rate the peaceful cohabitation, of science and the humanities is essential for the health of our civilisation. I speak as someone who has a foot in each camp, and an interest in their rapprochement. I agree wholly with Professor Pinker that each can learn from the other. And Professor Pinker is right to recognise that all is not as well as it might be in this relationship. Perhaps he feels he is offering therapy.
However, in any relationship there are at least two points of view, and two stories to tell about where the trouble lies. And to engage successfully in therapy you need to see both.
Before the emergence of empirical methodology – which allowed for methodical separation of subject and object in description – the world-model contained abstracted inferences about the nature of existence, derived primarily from observations of human behavior. This means, in essence, that pre-experimental man observed “morality” in his behavior and inferred the existence of a source for that morality in the structure of the “universe” itself. Of course, this “universe” is the experiential field – affect, imagination and all – and not the “objective” world constructed by the post-empirical mind. This prescientific “model of reality” primarily consisted of narrative representations of behavioral patterns (and of the contexts that surround them), and was concerned primarily with the motivational significance of events and processes. As this model became more abstract – as the semantic system analyzed the information presented in narrative format, but not “understood” – man generated imaginative “hypotheses” about the nature of the “ideal” human behavior, in the “archetypal” environment. This archetypal environment was (is) composed of three domains, which easily become three “characters”:
In the field of religion there are dogmatists of no-faith as there are of faith, and both seem to me closer to one another than those who try to keep the door open to the possibility of something beyond the customary ways in which we think, but which we would have to find, painstakingly, for ourselves. Similarly as regards science, there are those who are certain, God knows how, of what it is that patient attention to the world reveals, and those who really do not care, because their minds are already made up that science cannot tell them anything profound. Both seem to me profoundly mistaken. Though we cannot be certain what it is our knowledge reveals, this is in fact a much more fruitful position – in fact the only one that permits the possibility of belief. And what has limited the power of both art and science in our time has been the absence of belief in anything except the most diminished version of the world and our selves. Certainty is the greatest of all illusions: whatever kind of fundamentalism it may underwrite, that of religion or of science, it is what the ancients meant by hubris. The only certainty, it seems to me, is that those who believe they are certainly right are certainly wrong. The difference between scientific materialists and the rest is only this: the intuition of the one is that mechanistic application of reason will reveal everything about the world we inhabit, where the intuition of the others leads them to be less sure. Virtually every great physicist of the last century – Einstein, Bohr, Planck, Heisenberg, Bohm, amongst many others – has made the same point. A leap of faith is involved, for scientists as much as anyone. According to Max Planck, ‘Anybody who has been seriously engaged in scientific work of any kind realizes that over the entrance to the gates of the temple of science are written the words: Ye must have faith. 
It is a quality which the scientist cannot dispense with.’ And he continued: ‘Science cannot solve the ultimate mystery of nature. And that is because, in the last analysis, we ourselves are part of nature and therefore part of the mystery that we are trying to solve.’
In this book certainty has certainly not been my aim. I am not so much worried by the aspects that remain unclear, as by those which appear to be clarified, since that almost certainly means a failure to see clearly. I share Wittgenstein’s mistrust of deceptively clear models: and, as Waismann said, ‘any psychological explanation is ambiguous, cryptic and open-ended, for we ourselves are many-layered, contradictory and incomplete beings, and this complicated structure, which fades away into indeterminacy, is passed on to all our actions.’ I am also sympathetic to those who think that sounds like a cop-out. But I do think that things as they exist in practice in the real world, rather than as they exist in theory in our re-presentations, are likely to be intrinsically resistant to precision and clarification. That is not our failure, but an indication of the nature of what we are dealing with. That does not mean we should give up the attempt. It is the striving that enables us to achieve a better understanding, but only as long as it is imbued with a tactful recognition of the limits to human understanding. The rest is hubris.
If it could eventually be shown definitively that the two major ways, not just of thinking, but of being in the world, are not related to the two cerebral hemispheres, I would be surprised, but not unhappy. Ultimately what I have tried to point to is that the apparently separate ‘functions’ in each hemisphere fit together intelligently to form in each case a single coherent entity; that there are, not just currents here and there in the history of ideas, but consistent ways of being that persist across the history of the Western world, that are fundamentally opposed, though complementary, in what they reveal to us; and that the hemispheres of the brain can be seen as, at the very least, a metaphor for these. One consequence of such a model, I admit, is that we might have to revise the superior assumption that we understand the world better than our ancestors, and adopt a more realistic view that we just see it differently – and may indeed be seeing less than they did.
Iain McGilchrist, The Master and His Emissary: The Divided Brain and the Making of the Western World.