Gold does not harmonize with the character of our goods. Gold and straw, gold and petrol, gold and guano, gold and bricks, gold and iron, gold and hides! Only a wild fancy, a monstrous hallucination, only the doctrine of “value” can bridge the gulf. Commodities in general, straw, petrol, guano and the rest can be safely exchanged only when everyone is indifferent as to whether he possesses money or goods, and that is possible only if money is afflicted with all the defects inherent in our products. That is obvious. Our goods rot, decay, break, rust, so only if money has equally disagreeable, loss-involving properties can it effect exchange rapidly, securely and cheaply. For such money can never, on any account, be preferred by anyone to goods.
Only money that goes out of date like a newspaper, rots like potatoes, rusts like iron, evaporates like ether, is capable of standing the test as an instrument for the exchange of potatoes, newspapers, iron, and ether. For such money is not preferred to goods either by the purchaser or the seller. We then part with our goods for money only because we need the money as a means of exchange, not because we expect an advantage from possession of the money.
Silvio Gesell, The Natural Economic Order.
It is no accident that ancient Greece, the place where symbolic money originated, also gave birth to the modern conception of the individual, to the notions of logic and reason, and to the philosophical underpinnings of the modern mind. In his scholarly masterpiece Money and the Ancient Greek Mind, classics professor Richard Seaford explores the impact of money on Greek society and thought, illuminating the characteristics that make money unique. Among them are that it is both concrete and abstract, that it is homogeneous, impersonal, a universal aim, and a universal means, and that it is unlimited. The entrance of this new, unique power into the world had profound consequences, many of which are now so deeply woven into our beliefs and culture, psyche and society, that we can barely perceive them, let alone question them.
Money is homogeneous in that regardless of any physical differences among coins, coins qua money are identical (if they are of the same denomination). New or old, worn or smooth, all one drachma coins are equal. This was something new in the sixth century BCE. Whereas in archaic times, Seaford observes, power was conferred by unique talismanic objects (e.g., a scepter said to be handed down from Zeus), money is the opposite: its power is conferred by a standard sign that wipes out variations in purity and weight. Quality is not important, only quantity. Because money is convertible into all other things, it infects them with the same feature, turning them into commodities – objects that, as long as they meet certain criteria, are seen as identical. All that matters is how many or how much. Money, says Seaford, “promotes a sense of homogeneity among things in general.” All things are equal, because they can be sold for money, which can in turn be used to buy any other thing.
In the commodity world, things are equal to the money that can replace them. Their primary attribute is their “value”—an abstraction. I feel a distancing, a letdown, in the phrase, “You can always buy another one.” Can you see how this promotes an anti-materialism, a detachment from the physical world in which each person, place, and thing is special, unique? No wonder Greek philosophers of this era began elevating the abstract over the real, culminating in Plato’s invention of a world of perfect forms more real than the world of the senses. No wonder to this day we treat the physical world so cavalierly. No wonder, after two thousand years’ immersion in the mentality of money, we have become so used to the replaceability of all things that we behave as if we could, if we wrecked the planet, simply buy a new one.
I named this chapter “Money and the Mind.” Very much like the fiduciary value of money, mind is an abstraction riding a physical vehicle. Like monetary fiduciarity, the idea of mind as a separate, non-material essence of being developed over thousands of years, leading to the modern concept of an immaterial consciousness, a disembodied spirit. Tellingly, in both secular and religious thought, this abstraction has become more important than the physical vehicle, just as the “value” of a thing is more important than its physical attributes.
One manifestation of this spirit-matter split that gives primacy to the former is the idea, “Sure, economic reform is a worthy cause, but what is much more important is a transformation of human consciousness.” I think this view is mistaken, for it is based on a false dichotomy of consciousness and action, and ultimately of spirit and matter. On a deep level, money and consciousness are intertwined. Each is bound up in the other.
The development of monetary abstraction fits into a vast meta-historical context. Money could not have developed without a foundation of abstraction in the form of words and numbers. Already, number and label distance us from the real world and prime our minds to think abstractly. To use a noun already implies an identity among the many things so named; to say there are five of a thing makes each a unit. We begin to think of objects as representatives of a category, and not unique beings in themselves. So, while standard, generic categories didn’t begin with money, money vastly accelerated their conceptual dominance. Moreover, the homogeneity of money accompanied the rapid development of standardized commodity goods for trade. Such standardization was crude in pre-industrial times, but today manufactured objects are so nearly identical as to make the lie of money into the truth.
Money as a universal aim is embedded in our language. We speak of “capitalizing” on our ideas and use “gratuitous,” which literally means received with thanks (and not payment), as a synonym for unnecessary. It is embedded in economics to be sure, in the assumption that human beings seek to maximize a self-interest that is equivalent to money. It is even embedded in science, where it is a cipher for reproductive self-interest. Here, too, the notion of a universal aim has taken hold.
That there is even such a thing as a universal aim to life (be it money or something else) is not at all obvious. This idea apparently arose at about the same time money did; perhaps it was money that suggested it to philosophers. Socrates used a money metaphor explicitly in proposing intelligence as universal aim: “There is only one right currency for which we ought to exchange all these other things [pleasures and pains]—intelligence.” In religion this corresponds to the pursuit of an ultimate aim, such as salvation or enlightenment, from which all other good things flow. How like the unlimited aim of money! I wonder what the effect would be on our spirituality if we gave up on the pursuit of a unitary, abstract goal that we believe to be the key to everything else. How would it feel to release the endless campaign to improve ourselves, to make progress toward a goal? What would it be like just to play instead, just to be? Like wealth, enlightenment is a goal that knows no limit, and in both cases the pursuit of it can enslave. In both cases, I think that the object of the pursuit is a spurious substitute for a diversity of things that people really want.
In a fully monetized society, in which nearly everything is a good or a service, money converts the multiplicity of the world into a unity, a “single thing that is the measure of, and exchangeable with, almost anything else.” The apeiron, the logos, and similar conceptions were all versions of an underlying unity that gives birth to all things. It is that from which all things arise and to which all things return. As such it is nearly identical with the ancient Chinese conception of the Tao, which gives birth to yin and yang, and then to the ten thousand things. Interestingly, the semi-legendary preceptor of Taoism, Lao Tzu, lived at approximately the same time as the pre-Socratic philosophers – which is also more or less the time of the first Chinese coinage. In any event, today it is still money that gives birth to the ten thousand things. Whatever you want to build in this world, you start with an investment, with money. And then, when you have finished your project, it is time to sell it. All things come from money; all things return to money.
Unlike physical goods, the abstraction of money allows us, in principle, to possess unlimited quantities of it. Thus it is easy for economists to believe in the possibility of endless exponential growth, where a mere number represents the size of the economy. The sum total of all goods and services is a number, and what limit is there on the growth of a number? Lost in abstraction, we ignore the limits of nature and culture to accommodate our growth. Following Plato, we make the abstraction more real than the reality, fixing Wall Street while the real economy languishes. The monetary essence of things is called “value,” which, as an abstracted, uniform essence, reduces the plurality of the world. All things are reduced to what they are worth. This gives the illusion that the world is as limitless as numbers are. For a price, you can buy anything.
Charles Eisenstein, Sacred Economics: Money, Gift and Society in the Age of Transition
When pre-experimental man conceived of the unknown as an ambivalent mother, he was not indulging in childish fantasy. He was applying what he knew to what was unfamiliar, but could not be ignored. Man’s first attempts to describe the unknown cannot be faulted because they lacked empirical validity. Man was not originally an empirical thinker. This does not mean he was self-deluded, or a liar. Likewise, when the individual worships the hero, he is not necessarily hiding from reality. It may also be that he is ready and willing to face the unknown, as an individual; that he is prepared to adopt the pattern of heroic endeavour in his own life, and to further creation in that manner.
The great myths of Christianity – the great myths of the past, in general – no longer speak to the majority of westerners, who regard themselves as educated. The mythic view of history cannot be credited with reality, from the material, empirical point of view. It is nonetheless the case that all of western ethics, including those explicitly formalized in western law, are predicated upon a mythological world-view, which specifically attributes divine status to the individual. The modern individual is therefore in a unique position: he no longer believes that the principles upon which all his behaviors are predicated are valid. This might be considered a second fall, in that the destruction of the western mythological barrier has re-exposed the essential tragedy of individual existence to view.
It is not the pursuit of empirical truth, however, that has wreaked havoc upon the Christian worldview: it is confusion of empirical fact with moral truth, to the great detriment of the latter. This confusion has produced what might be described as a secondary gain, which has played an important role in maintaining the confusion. That gain is abdication of the absolute personal responsibility imposed in consequence of recognition of the divine in man. This responsibility means acceptance of the trials and tribulations associated with expression of unique individuality, as well as respect for such expression in others. Such acceptance, expression and respect require courage in the absence of certainty, and discipline in the smallest matters.
Rejection of moral truth allows for rationalization of cowardly, destructive, degenerate self-indulgence. This is one of the most potent attractions of such rejection, and constitutes primary motivation for the lie. The lie, above all else, threatens the individual – and the interpersonal. The lie is predicated upon the presupposition that the tragedy of individuality is unbearable – that human experience itself is evil. The individual lies because he is afraid – and it is not the lies he tells another that present the clearest danger, but the lies he tells himself. The root of social and individual psychopathology, the “denial,” the “repression” – is the lie. The most dangerous lie of all is devoted towards denial of individual responsibility – towards denial of individual divinity.
The idea of the divine individual took thousands of years to fully develop, and is still constantly threatened by direct attack and insidious counter-movement. It is based upon realization that the individual is the locus of experience. All that we can know about reality we know through experience. It is therefore simplest to assume that all there is of reality is experience, in being and progressive unfolding. Furthermore, it is the subjective aspect of individuality – of experience – that is divine, not the objective. Man is an animal, from the objective viewpoint, worthy of no more consideration than the opinion and opportunities of the moment dictate. From the mythic viewpoint, however, every individual is unique – is a new set of experiences, a new universe; has been granted the ability to bring something new into being; is capable of participating in the act of creation itself. It is the expression of this capacity for creative action that makes the tragic conditions of life tolerable, bearable – remarkable, miraculous.
The paradise of childhood is absolute meaningful immersion. That immersion is a genuine manifestation of subjective interest. Interest accompanies the honest pursuit of the unknown, in a direction and at a rate subjectively determined. The unknown, in its beneficial guise, is the ground of interest, the source of what matters. Culture, in its supportive role, extends the power with which the unknown can be met, by disciplining the individual and expanding his range of ability. In childhood, the parent serves as cultural surrogate, and the child explores under the umbrella of protection provided by his parents. The parental mechanism has its limits, however, and must be superseded by the internalization of culture – by the intrapsychic incorporation of belief, security, and goal. Adoption of this secondary protective structure dramatically extends and shapes individual capability.
The dragon limits the pursuit of individual interest. The struggle with the dragon – against the forces that devour will and hope – constitutes the heroic battle, in the mythological world. Faithful adherence to the reality of personal experience ensures contact with the dragon – and it is during such contact that the great force of the individual spirit makes itself manifest, if it is allowed to. The hero voluntarily places himself in opposition to the dragon. The liar pretends that the great danger does not exist, to his peril and to that of others, or abdicates his relationship with his essential interest, and abandons all chance at further development.
Interest is meaning. Meaning is manifestation of the divine individual adaptive path. The lie is abandonment of individual interest – hence meaning, hence divinity – for safety and security; is sacrifice of the individual to appease the Great Mother and Great Father.
Jordan B. Peterson, Maps of Meaning: The Architecture of Belief
Patterns in language yield patterns in thought. Extensive research has now demonstrated that differences between languages can yield differences, often subtle ones, in the cognitive habits of their speakers. This finding, commonly referred to as linguistic relativity, has now been supported by dozens of studies on topics like spatial awareness, the perceptions of time, and the categorisation of colours. For instance, “where” the future and past “are” depends on the language you speak. Similarly, the manner in which you recall and discriminate colours is affected in subtle ways by the basic colour term inventory of your native language. Our tour of the numberless worlds ultimately led to the conclusion that numeric language also yields differences in how people think. Number words, present in the vast majority of the world’s languages (though not all of them), certainly influence quantitative cognition. Only those people who are familiar with number words and counting can exactly differentiate most quantities. The presence of numbers in a language does not just subtly influence how we think about certain quantities, then; it also opens up a door to the world of arithmetic and mathematics. The first step through that door is the realisation that quantities, regardless of size, can be precisely differentiated. But how exactly do numbers first open this door? And what happens after we walk through it?
The findings from numberless worlds suggest plainly that we need numbers to really “get” quantities in ways that are uniquely human, but this raises a paradox. If we need numbers to appreciate most quantities precisely, how did we get numbers in the first place? How could we ever name the amounts in particular sets of items, if we could not recognise the amount?
Given the apparent intractability of this paradox, some have concluded that humans must be innately predisposed to acquire number concepts. But, if we are predisposed to recognise different set sizes as separate abstract entities, then what is the limit to this predisposition? Are we naturally predisposed, for example, to eventually realise that 1,023 is not 1,024? This seems fairly implausible. Framed differently, nativist views on numbers just delay the point at which we reach the paradox.
James Hurford noted that number words are names for the “non-linguistic entities denoted by numbers.” That is, the number words label conceptual entities. In a related vein, Karenleigh Overmann recently suggested that “quantity concepts must surely precede their lexical labels, or there would be nothing to name… A method of invention cannot presuppose that which it invents.” This latter stance is understandable, but it arguably trivialises the extensive evidence, according to which, words for quantities beyond three do not simply label pre-existing concepts, because these concepts do not exist for most people until they actually learn numbers.
In my view, this is the key to resolving the paradox: words for quantities beyond three make concrete the precise numerical abstractions that are only occasionally and inconsistently made by some people. Some of these people may eventually invent numbers, but if they do not, their fleeting abstractions are not transferred to others. The naming of such ephemeral realisations is what eventually enables people to consistently show the ability to make a simple but powerful realisation, the realisation that sets of quantities greater than three can be identified precisely. This simple realisation has led, in all likelihood more times than could be documented, to the invention of symbols for such larger quantities. These symbols are chiefly verbal in nature, judging from the fact that the overwhelming majority of the world’s cultures have words for such quantities though most cultures traditionally lack written numerals or elaborate tally systems. Some people invented number words to concretise the potentially transient recognition of the existence of exact higher quantities.
Does this mean that number words simply serve as labels for the concepts? Not really. The truth seems a bit more nuanced than the forced dichotomous choice assumed by the paradox. Number words are not simply labels, yet they do describe conceptual realisations that some people sometimes make. The term ‘label’ implies that the words simply denote concepts that we all think about: concepts all humans are born ready to appreciate (at least eventually), regardless of their cultural environment. But clearly not all humans have such concepts at the ready even as adults, and likely most people would never make the relevant realisations that can be described via numbers. Just as clearly, though, some people have made those realisations, even if inconsistently. In those real historical cases in which people managed to describe that realisation with a word, they invented numbers. The concept they named was subsequently recognised by other members of their culture through the adoption of the relevant word(s). Number words are conceptual tools that get passed around with ease, tools most people want to borrow.
Caleb Everett, Numbers and the Making of Us: Counting and the Course of Human Cultures
In the field of religion there are dogmatists of no-faith as there are of faith, and both seem to me closer to one another than those who try to keep the door open to the possibility of something beyond the customary ways in which we think, but which we would have to find, painstakingly, for ourselves. Similarly as regards science, there are those who are certain, God knows how, of what it is that patient attention to the world reveals, and those who really do not care, because their minds are already made up that science cannot tell them anything profound. Both seem to me profoundly mistaken. Though we cannot be certain what it is our knowledge reveals, this is in fact a much more fruitful position – in fact the only one that permits the possibility of belief. And what has limited the power of both art and science in our time has been the absence of belief in anything except the most diminished version of the world and our selves. Certainty is the greatest of all illusions: whatever kind of fundamentalism it may underwrite, that of religion or of science, it is what the ancients meant by hubris. The only certainty, it seems to me, is that those who believe they are certainly right are certainly wrong. The difference between scientific materialists and the rest is only this: the intuition of the one is that mechanistic application of reason will reveal everything about the world we inhabit, where the intuition of the others leads them to be less sure. Virtually every great physicist of the last century – Einstein, Bohr, Planck, Heisenberg, Bohm, amongst many others – has made the same point. A leap of faith is involved, for scientists as much as anyone. According to Max Planck, ‘Anybody who has been seriously engaged in scientific work of any kind realizes that over the entrance to the gates of the temple of science are written the words: Ye must have faith. 
It is a quality which the scientist cannot dispense with.’ And he continued: ‘Science cannot solve the ultimate mystery of nature. And that is because, in the last analysis, we ourselves are part of nature and therefore part of the mystery that we are trying to solve.’
In this book certainty has certainly not been my aim. I am not so much worried by the aspects that remain unclear, as by those which appear to be clarified, since that almost certainly means a failure to see clearly. I share Wittgenstein’s mistrust of deceptively clear models: and, as Waismann said, ‘any psychological explanation is ambiguous, cryptic and open-ended, for we ourselves are many-layered, contradictory and incomplete beings, and this complicated structure, which fades away into indeterminacy, is passed on to all our actions.’ I am also sympathetic to those who think that sounds like a cop-out. But I do think that things as they exist in practice in the real world, rather than as they exist in theory in our re-presentations, are likely to be intrinsically resistant to precision and clarification. That is not our failure, but an indication of the nature of what we are dealing with. That does not mean we should give up the attempt. It is the striving that enables us to achieve a better understanding, but only as long as it is imbued with a tactful recognition of the limits to human understanding. The rest is hubris.
If it could eventually be shown definitively that the two major ways, not just of thinking, but of being in the world, are not related to the two cerebral hemispheres, I would be surprised, but not unhappy. Ultimately what I have tried to point to is that the apparently separate ‘functions’ in each hemisphere fit together intelligently to form in each case a single coherent entity; that there are, not just currents here and there in the history of ideas, but consistent ways of being that persist across the history of the Western world, that are fundamentally opposed, though complementary, in what they reveal to us; and that the hemispheres of the brain can be seen as, at the very least, a metaphor for these. One consequence of such a model, I admit, is that we might have to revise the superior assumption that we understand the world better than our ancestors, and adopt a more realistic view that we just see it differently – and may indeed be seeing less than they did.
Iain McGilchrist, The Master and His Emissary: The Divided Brain and the Making of the Western World.
Let’s [address] the question of how humans acquired music and language, since it helps us to understand the revolutionary power of imitation. Music and language are skills, and skills are not like physical attributes – bigger wings, longer legs: not only can they be imitated, which obviously physical characteristics on the whole can’t, but in the case of music and language they are reciprocal skills, of no use to individuals on their own, though of more than a little use to a group. An account of the development of skills such as language purely by the competitive force of classical natural selection has to contend not only with the fact that the skills could easily be mimicked by those not genetically related, thus seriously eroding the selective power in favour of the gene, but also with the fact that unless they were mimicked they wouldn’t be much use. Imitation would itself have a selective advantage: it would enable those who were skilled imitators to strengthen the bonds that tied them to others within the group, and make social groups stable and enduring. Those groups that were most cohesive would survive best, and the whole group’s genes would do better, or not, depending on the acquisition of shared skills that promote bonding – such as music, or ultimately language. Those individuals less able to imitate would be less well bound into the group, and would not prosper to the same degree.
The other big selective factor in acquiring skills and fitting in with the group would be flexibility, which comes with expansion of the frontal lobes – particularly the right frontal lobe, which is also the seat of social intelligence. Skills are intuitive, ‘inhabited’ ways of being and behaving, not analytically structured, rule-based techniques. So it may be that we were selected – not for specific abilities, with specific genes for each, such as the ‘language gene(s)’ or the ‘music gene(s)’ – not even ‘group selected’ for such genes – but individually for the dual skills of flexibility and the power to mimic, which are what is required to develop skills in general.
We sometimes think, and even like to think, that the two greatest exertions that have influenced mankind, religion and science, have always been historical enemies, intriguing us in opposite directions. But this effort at special identity is loudly false. It is not religion but the church and science that were hostile to each other. And it was rivalry, not contravention. Both were religious. They were two giants fuming at each other over the same ground. Both proclaimed to be the only way to divine revelation.
It was a competition that first came into absolute focus with the late Renaissance, particularly in the imprisonment of Galileo in 1633. The stated and superficial reason was that his publications had not been first stamped with papal approval. But the true argument, I am sure, was no such trivial surface event. For the writings in question were simply the Copernican heliocentric theory of the solar system which had been published a century earlier by a churchman without any fuss whatever. The real division was more profound and can, I think, only be understood as a part of the urgency behind mankind’s yearning for divine certainties. The real chasm was between the political authority of the church and the individual authority of experience. And the real question was whether we are to find our lost authorization through an apostolic succession from ancient prophets who heard divine voices, or through searching the heavens of our own experience right now in the objective world without any priestly intercession. As we all know, the latter became Protestantism and, in its rationalist aspect, what we have come to call the Scientific Revolution.
Julian Jaynes, The Origin of Consciousness in the Breakdown of the Bicameral Mind