Patterns in language yield patterns in thought. Extensive research has now demonstrated that differences between languages can yield differences, often subtle ones, in the cognitive habits of their speakers. This finding, commonly referred to as linguistic relativity, has now been supported by dozens of studies on topics like spatial awareness, the perception of time, and the categorisation of colours. For instance, “where” the future and past “are” depends on the language you speak. Similarly, the manner in which you recall and discriminate colours is affected in subtle ways by the basic colour term inventory of your native language. Our tour of the numberless worlds ultimately led to the conclusion that numeric language also yields differences in how people think. Number words, present in the vast majority of the world’s languages (though not all of them), certainly influence quantitative cognition. Only those people who are familiar with number words and counting can exactly differentiate most quantities. The presence of numbers in a language does not just subtly influence how we think about certain quantities, then; it also opens up a door to the world of arithmetic and mathematics. The first step through that door is the realisation that quantities, regardless of size, can be precisely differentiated. But how exactly do numbers first open this door? And what happens after we walk through it?
The findings from numberless worlds suggest plainly that we need numbers to really “get” quantities in ways that are uniquely human, but this raises a paradox. If we need numbers to appreciate most quantities precisely, how did we get numbers in the first place? How could we ever name the amounts in particular sets of items, if we could not recognise the amounts?
Given the apparent intractability of this paradox, some have concluded that humans must be innately predisposed to acquire number concepts. But if we are predisposed to recognise different set sizes as separate abstract entities, then what is the limit to this predisposition? Are we naturally predisposed, for example, to eventually realise that 1,023 is not 1,024? This seems fairly implausible. Framed differently, nativist views on numbers just delay the point at which we reach the paradox.
James Hurford noted that number words are names for the “non-linguistic entities denoted by numbers.” That is, the number words label conceptual entities. In a related vein, Karenleigh Overmann recently suggested that “quantity concepts must surely precede their lexical labels, or there would be nothing to name… A method of invention cannot presuppose that which it invents.” This latter stance is understandable, but it arguably trivialises the extensive evidence according to which words for quantities beyond three do not simply label pre-existing concepts, because these concepts do not exist for most people until they actually learn numbers.
In my view, this is the key to resolving the paradox: words for quantities beyond three make concrete the precise numerical abstractions that are only occasionally and inconsistently made by some people. Some of these people may eventually invent numbers, but if they do not, their fleeting abstractions are not transferred to others. The naming of such ephemeral realisations is what eventually enables people to consistently show the ability to make a simple but powerful realisation, the realisation that sets of quantities greater than three can be identified precisely. This simple realisation has led, in all likelihood more times than could be documented, to the invention of symbols for such larger quantities. These symbols are chiefly verbal in nature, judging from the fact that the overwhelming majority of the world’s cultures have words for such quantities though most cultures traditionally lack written numerals or elaborate tally systems. Some people invented number words to concretise the potentially transient recognition of the existence of exact higher quantities.
Does this mean that number words simply serve as labels for the concepts? Not really. The truth seems a bit more nuanced than the forced dichotomous choice assumed by the paradox. Number words are not simply labels, yet they do describe conceptual realisations that some people make sometimes. The term ‘label’ implies that the words simply denote concepts that we all think about: concepts all humans are born ready to appreciate (at least eventually), regardless of their cultural environment. But clearly not all humans have such concepts at the ready even as adults, and likely most people would never make the relevant realisations that can be described via numbers. Just as clearly, though, some people have made those realisations, even if inconsistently. In those real historical cases in which people managed to describe that realisation with a word, they invented numbers. The concept they named was subsequently recognised by other members of their culture through the adoption of the relevant word(s). Number words are conceptual tools that get passed around with ease, tools most people want to borrow.
Caleb Everett, Numbers and the Making of Us: Counting and the Course of Human Cultures.
In the field of religion there are dogmatists of no-faith as there are of faith, and both seem to me closer to one another than those who try to keep the door open to the possibility of something beyond the customary ways in which we think, but which we would have to find, painstakingly, for ourselves. Similarly as regards science, there are those who are certain, God knows how, of what it is that patient attention to the world reveals, and those who really do not care, because their minds are already made up that science cannot tell them anything profound. Both seem to me profoundly mistaken. Though we cannot be certain what it is our knowledge reveals, this is in fact a much more fruitful position – in fact the only one that permits the possibility of belief. And what has limited the power of both art and science in our time has been the absence of belief in anything except the most diminished version of the world and our selves. Certainty is the greatest of all illusions: whatever kind of fundamentalism it may underwrite, that of religion or of science, it is what the ancients meant by hubris. The only certainty, it seems to me, is that those who believe they are certainly right are certainly wrong. The difference between scientific materialists and the rest is only this: the intuition of the one is that mechanistic application of reason will reveal everything about the world we inhabit, where the intuition of the others leads them to be less sure. Virtually every great physicist of the last century – Einstein, Bohr, Planck, Heisenberg, Bohm, amongst many others – has made the same point. A leap of faith is involved, for scientists as much as anyone. According to Max Planck, ‘Anybody who has been seriously engaged in scientific work of any kind realizes that over the entrance to the gates of the temple of science are written the words: Ye must have faith. 
It is a quality which the scientist cannot dispense with.’ And he continued: ‘Science cannot solve the ultimate mystery of nature. And that is because, in the last analysis, we ourselves are part of nature and therefore part of the mystery that we are trying to solve.’
In this book certainty has certainly not been my aim. I am not so much worried by the aspects that remain unclear, as by those which appear to be clarified, since that almost certainly means a failure to see clearly. I share Wittgenstein’s mistrust of deceptively clear models: and, as Waismann said, ‘any psychological explanation is ambiguous, cryptic and open-ended, for we ourselves are many-layered, contradictory and incomplete beings, and this complicated structure, which fades away into indeterminacy, is passed on to all our actions.’ I am also sympathetic to those who think that sounds like a cop-out. But I do think that things as they exist in practice in the real world, rather than as they exist in theory in our re-presentations, are likely to be intrinsically resistant to precision and clarification. That is not our failure, but an indication of the nature of what we are dealing with. That does not mean we should give up the attempt. It is the striving that enables us to achieve a better understanding, but only as long as it is imbued with a tactful recognition of the limits to human understanding. The rest is hubris.
If it could eventually be shown definitively that the two major ways, not just of thinking, but of being in the world, are not related to the two cerebral hemispheres, I would be surprised, but not unhappy. Ultimately what I have tried to point to is that the apparently separate ‘functions’ in each hemisphere fit together intelligently to form in each case a single coherent entity; that there are, not just currents here and there in the history of ideas, but consistent ways of being that persist across the history of the Western world, that are fundamentally opposed, though complementary, in what they reveal to us; and that the hemispheres of the brain can be seen as, at the very least, a metaphor for these. One consequence of such a model, I admit, is that we might have to revise the superior assumption that we understand the world better than our ancestors, and adopt a more realistic view that we just see it differently – and may indeed be seeing less than they did.
Iain McGilchrist, The Master and His Emissary: The Divided Brain and the Making of the Western World.
Let’s [address] the question of how humans acquired music and language, since it helps us to understand the revolutionary power of imitation. Music and language are skills, and skills are not like physical attributes – bigger wings, longer legs: not only can they be imitated, which obviously physical characteristics on the whole can’t, but in the case of music and language they are reciprocal skills, of no use to individuals on their own, though of more than a little use to a group. An account of the development of skills such as language purely by the competitive force of classical natural selection has to contend not only with the fact that the skills could easily be mimicked by those not genetically related, thus seriously eroding the selective power in favour of the gene, but also with the fact that unless they were mimicked they wouldn’t be much use. Imitation would itself have a selective advantage: it would enable those who were skilled imitators to strengthen the bonds that tied them to others within the group, and make social groups stable and enduring. Those groups that were most cohesive would survive best, and the whole group’s genes would do better, or not, depending on the acquisition of shared skills that promote bonding – such as music, or ultimately language. Those individuals less able to imitate would be less well bound into the group, and would not prosper to the same degree.
The other big selective factor in acquiring skills and fitting in with the group would be flexibility, which comes with expansion of the frontal lobes – particularly the right frontal lobe, which is also the seat of social intelligence. Skills are intuitive, ‘inhabited’ ways of being and behaving, not analytically structured, rule-based techniques. So it may be that we were selected – not for specific abilities, with specific genes for each, such as the ‘language gene(s)’ or the ‘music gene(s)’ – not even ‘group selected’ for such genes – but individually for the dual skills of flexibility and the power to mimic, which are what is required to develop skills in general.
From a philosophical perspective, the discovery of mirror neurons is exciting because it gave us an idea of how motor primitives could have been used as semantic primitives: that is, how meaning could be communicated between agents. Thanks to our mirror neurons, we can consciously experience another human being’s movements as meaningful. Perhaps the evolutionary precursor of language was not animal calls but gestural communication. The transmission of meaning may initially have grown out of the unconscious bodily self-model and out of motor agency, based, in our primate ancestors, on elementary gesturing. Sounds may only later have been associated with gestures, perhaps with facial gestures—such as scowling, wincing, or grinning—that already carried meaning. Still today, the silent observation of another human being grasping an object is immediately understood, because, without symbols or thought in between, it evokes the same motor representation in the parieto-frontal mirror system of our own brain. As Professor Rizzolatti and Dr. Maddalena Fabbri Destro from the Department of Neuroscience at the University of Parma put it: “[T]he mirror mechanism solved, at an initial stage of language evolution, two fundamental communication problems: parity and direct comprehension. Thanks to the mirror neurons, what counted for the sender of the message also counted for the receiver. No arbitrary symbols were required. The comprehension was inherent in the neural organization of the two individuals.”
Such ideas give a new and rich meaning not only to the concepts of “grasping” and “mentally grasping the intention of another human being,” but, more important, also to the concept of grasping a concept—the essence of human thought itself. It may have to do with simulating hand movements in your mind but in a much more abstract manner. Humankind has apparently known this for centuries, intuitively: “Concept” comes from the Latin conceptum, meaning “a thing conceived,” which, like our modern “to conceive of something,” is rooted in the Latin verb concipere, “to take in and hold.” As early as 1340, a second meaning of the term had appeared: “taking into your mind.” Surprisingly, there is a representation of the human hand in Broca’s area, a section of the human brain involved in language processing, speech or sign production, and comprehension. A number of studies have shown that hand/arm gestures and movements of the mouth are linked through a common neural substrate. For example, grasping movements influence pronunciation— and not only when they are executed but also when they are observed. It has also been demonstrated that hand gestures and mouth gestures are directly linked in humans, and the oro-laryngeal movement patterns we create in order to produce speech are a part of this link.
Broca’s area is also a marker for the development of language in human evolution, so it is intriguing to see that it also contains a motor representation of hand movements; here may be a part of the bridge that led from the “body semantics” of gestures and the bodily self-model to linguistic semantics, associated with sounds, speech production, and abstract meaning expressed in our cognitive self-model, the thinking self. Broca’s area is present in fossils of Homo habilis, whereas the presumed precursors of these early hominids lacked it. Thus the mirror mechanism is conceivably the basic mechanism from which language evolved. By providing motor copies of observed actions, it allowed us to extract the action goals from the minds of other human beings—and later to send abstract meaning from one Ego Tunnel to the next.
The mirror-neuron story is attractive not only because it bridges neuroscience and the humanities but also because it illuminates a host of simpler social phenomena. Have you ever observed how infectious a yawn is? Have you ever caught yourself starting to laugh out loud with others, even though you didn’t really understand the joke? The mirror-neuron story gives us an idea of how groups of animals—fish schools, flocks of birds—can coordinate their behavior with great speed and accuracy; they are linked through something one might call a low-level resonance mechanism. Mirror neurons can help us understand why parents spontaneously open their mouths while feeding their babies, what happens during a mass panic, and why it is sometimes hard to break away from the herd and be a hero. Neuroscience contributes to the image of humankind: We are all connected in an intersubjective space of meaning—what Vittorio Gallese calls a “shared manifold.”
Thomas Metzinger, The Ego Tunnel: The Science of the Mind and the Myth of the Self.
There is no nature, only Nature – an imaginary state of man’s own invention, a realm of concept and language. That is man’s place and it is nowhere except inside his head; a mirror image of a distorted fantasy he calls Mankind. A distortion of a distortion, exponentially phantasmagorical. Nature is a conceit: a man-made garden in which we wander to relax and preen, as we nod to one another in passing, and congratulate ourselves on being us. We created Nature so that we might take pride in how far we have ventured beyond it.
Man has no place in nature because there is no nature: only what he makes. He is therefore beyond nothing. He is merely self-deceived. Forever trapped inside his self-inflated dream of what he is. A pathetic child imagining himself in the world, when, in reality, he is confined by the four walls of his playroom. His ‘world’ being nothing more than the arrangement of his diminutive models and playthings.
Man is exiled from the real world, from nature, by language. He is the willing prisoner of words. All his high-mindedness, his ideals, morality, stemming merely from the necessity of language. True nature cares for nothing, neither life nor death. It is simply in a perpetual motion of growth and decay, beyond value or morality. Lacking the curse of consciousness and the petty ethics that entails, the natural world lives and dies blindly, without intention, regenerates or doesn’t. There is no system, only a multiplicity of life cycles; parts that remain separate, that never add up to a whole. Nature does not do arithmetic. Man is one of a myriad of dissociated parts, not an outside observer of an illusory unity.
If he tears down the forests or fights for their preservation, he does it for himself. It is of no consequence to nature, whose disparate parts survive or don’t, without sensibility. The ‘ecosystem’ is man’s vision of where he is and, in reality, no system at all. The environment is his own orderly invention, his realm, but the environment cares neither for its own death nor man’s. Nor does it care for man’s care for it. Man makes a lapdog of a planet in which he is merely a passing formulation of life: the current arrangement of molecules. His continued existence, and that of the planet itself, is of no importance to anything other than a few temporary particles that are our species.
Jenny Diski, Rainforest.
In a time and in a country where everyone goes out of his way to announce opinions or hand down judgements, Mr Palomar has made a habit of biting his tongue three times before asserting anything. After the bite, if he is still convinced of what he was going to say, he says it. If not, he keeps his mouth shut. In fact, he spends whole weeks, months in silence.
Good opportunities for keeping quiet are never in short supply, but there are also rare occasions when Mr Palomar regrets not having said something he could have said at the right moment. He realizes that events have confirmed what he was thinking and if he had expressed his thoughts at the time, he would have had a positive influence, however slight, on what then ensued. In these cases his spirit is torn between self-satisfaction for having seen things properly and a sense of guilt because of his excessive reserve. Both feelings are so strong that he is tempted to put them into words; but after having bitten his tongue three times, or rather six, he is convinced he has no cause either for pride or remorse.
Having had the correct view is nothing meritorious: statistically, it is almost inevitable that among the many cockeyed, confused or banal ideas that come into his mind, there should also be some perspicacious ideas, even ideas of genius; and as they occurred to him, they can surely have occurred also to somebody else.
Opinion on his having refrained from expressing his idea is more open to debate. In times of general silence, conforming to the silence of the majority is certainly culpable. In times when everybody says too much, the important thing is not merely to say what is right, which in any event would be lost in the flood of words, but to say it on the basis of premisses, suggesting also consequences, so that what is said acquires the maximum value. But then, if the value of a single affirmation lies in the continuity and coherence of the discourse in which it is uttered, the only possible choice is between speaking continuously or never speaking at all. In the first case Mr Palomar would reveal that his thinking does not proceed in a straight line but zigzags its way through vacillations, denials, corrections, in whose midst the rightness of that affirmation of his would be lost. As for the other alternative, it implies an art of keeping silent even more difficult than the art of speaking.
In fact, silence can also be considered a kind of speech, since it is a rejection of the use to which others put words; but the meaning of this silent speech lies in its interruptions, in what is, from time to time, actually said, giving a meaning to what is unsaid.
Or rather: a silence can serve to dismiss certain words or else to hold them in reserve for use on a better occasion. Just as a word spoken now can save a hundred words tomorrow or else can necessitate the saying of another thousand. “Every time I bite my tongue,” Mr Palomar concludes mentally, “I must think not only of what I am about to say or not to say, but also of everything that, whether I say it or do not say it, will be said or not said by me or by others.” Having formulated this thought, he bites his tongue and remains silent.
Italo Calvino, Mr Palomar.