The Brick Wall of Washing Machines

People probably make too much fuss about defining biological sex in terms of its organic components. The term “chromosomes” gets thrown about, maybe because it is commonly used in basic biology education and is consequently a bit more accessible than “gametes,” although gametes are in fact the heart of the matter. Several different chromosomal combinations exist in humans (as abnormalities) besides XX and XY, but gametes come in only two forms – sperm and ova, the component factors of sexual reproduction.

But why does sexual reproduction itself exist, and by extension, why do the two sexes themselves? It is not a given across all species. Quite a few species of plants and some unicellular organisms practise autogamous fertilisation, effectively a slightly modified form of cloning in which the variants of sex are applied to an otherwise identical genetic template. Others, like the New Mexico whiptail, are parthenogenetic, meaning that females can produce more females (clones) with no fertilisation at all. Most often this manifests as a “fail-safe,” in species such as the Komodo monitor, for environments with a shortage of males. Obligate parthenogens are rare. When it happens, it tends to be the result of an unusually torpid environment combined with some kind of recent fuck-up. In the case of the obligately parthenogenetic New Mexico whiptail: it lives primarily in the desert and owes its existence to cross-breeding between two parent lizard species which cannot produce viable males. If its environment changes too much, it is fucked: cloning and autogamy place a hard limit on gene recombination, and therefore adaptation, which is why autogamy really only exists in plants and invertebrates, and cloning mostly presents as a “failure mode” in otherwise sexually reproducing species. Either is only “practical” in species with extraordinarily high reproductive potential, short gestation periods, sedate or undemanding environments, low metabolic needs, or high mutation rates.

Given this, it is not hard to see where males and females came from. Think of The Sexes™ as a strategy of gene propagation, and then secondary sex differences, in morphology and psychology, as strategies which reflect the different selective pressures the sexes were subjected to and/or subjected each other to (dimorphism). Viewed through this lens, females represent the “default” strategy which began with the oldest organisms (e.g. asexual bacteria): the “incubators,” reproducing through cloning and self-fertilisation, whereas males, the “fertilisers,” are a comparatively recent innovation. The degree of “sex-differentiation load” that falls upon males varies by species according to the selective variables mentioned above. Since females are, as is often noted, the gatekeepers of reproduction, the selection pressures that act primarily on females tend to be similar across species and relate, directly or obliquely, to their ability to bear offspring. For males, the story revolves around the conditions of access to females, which is why the male sex “morph” (form) differentiates itself from the female in completely different ways across species.

Sometimes male and female are barely distinguishable from one another. This is the case for many monogamous avians, whose environments, for whatever reason, do not lend themselves to significant sexual differentiation, which reduces female choosiness, which limits dimorphism: it is a negative feedback system. Other birds, like the crested auklet, engage in a kind of mutually eliminative sexual selection, whereby each sex vets the other for organically expensive sexual ornaments for reasons that are not well understood. In elephant seals, the degree of sex differentiation, just in size, borders on the absurd, although their (relative to humans) feeble brains mean that the possible scope of behavioural differentiation is not all that striking most of the time. Exactly where humans “fit” on these continua of male sex differentiation is something of a relative judgement call, but we are obviously not auklets or crows.

Sexual dimorphism and monomorphism have special behavioural correlates, most of which are obvious. Monomorphic species tend to be monogamous, with fairly equal parental investment in offspring and low variance in male reproductive success. Dimorphics tend towards, well, the opposite of those traits. Humans also have a lengthier pre-reproductive schedule than most animals, largely because of how long it takes the human brain to develop, which probably limits sex differentiation in e.g. aggression compared with some species that practise effective polygyny. Different normative mating systems between human societies will also affect it, notwithstanding other forces such as judicially enforced genetic pacification. There is also considerable variation in these “life history traits” through time: from a time when “childhood” was seldom acknowledged as its own entity and children were expected to be responsible, to the point of execution, for criminal wrongdoing from an extremely young age, to … whatever you would call the situation we have now. Certain kinds of change may be inevitable, in this respect. Other things are remarkably changeless even in the face of new environments.

Human sexual dimorphism is an example of this changelessness. If aliens were to observe the human sexes 100 years ago and now, they would note stability in a range of male and female responses to exogenous stimuli, and note the differences in underlying strategy. Males are the strategy of high risk, aggression, dominance, status-seeking, agency and systems orientation; females are the strategy of low risk, passive aggression, emotional dominance, comfort-seeking, agency by proxy, and social orientation. (A great example of the agency/agency-by-proxy distinction can be seen in sex-specific antisocial behaviours such as psychopathy in males and Briquet’s syndrome in females.) They would note that human females are the limiting factor in reproduction, but human males are the limiting factor in just about everything else (obligatory Paglia quote about living in grass huts, etc.). Intelligence is probably not a sexually selected trait in humans, or at least, there is little good evidence for it, and sex differences in intelligence per se are trivial. The sex difference is in application. Human brain complexity and its antecedents mean that the domain of activities germane to preserving one’s genetic line is rather more elaborate than normal, and since females are the “selector” sex, those tasks, and selection for assiduous task-doing, fall upon the males.

There is no real sense in which human beings can “escape” natural selection, because natural selection is the reason behind everything that we are, including the desire (of some) to “overcome” natural selection, whatever that means. However, natural selection has also given us moral instincts and reasoning abilities which, combined with the technologies born mostly of male ingenuity, could allow us to divert evolutionary selection pressures in a way that could never happen without our technology. The crapshoot of genetic recombination, by the lights of human morality, is just that: a crapshoot. At some point, artificial gametogenesis could allow humans to become effective hermaphrodites, even if we still have the old equipment. CRISPR, and eventually full genome synthesis, could render natural recombination processes, and therefore sexual reproduction itself, obsolete. Childhood will increasingly resemble adulthood as we produce children of vastly superior intelligence and thus reduce the need for high investment. Male breadwinning social roles will run into a brick wall of automation, or perhaps cloning of the 99.999th-percentile most workaholic and intelligent workers. Female homemaking roles will (or have?) run into a brick wall of washing machines. As technology outpaces our obsolescent biological hardware, one seriously has to wonder: how much of the human intersexual dynamic, i.e. behavioural sexual dimorphism, is worth preserving? Maybe we could do with being more like the monomorphic crows.

Alternatively, perhaps one imagines a world of nearly infinite morphological freedom where individuals can modify their own physiology and psychology with ease, unconstrained by sex, like character profiles in an RPG, and where sex and gender, insomuch as they exist, amount to little more than fashion. One may dream.

A World of Trauma – Civilizational Psychosadomasochism and Emptiness

According to Google’s vast textual corpora, there was nary an instance of the term “trauma,” or its distinctly psychiatric derivative “traumatized,” in written English prior to the 1880s. The first usage of “trauma” is documented in the 1690s, at which point it referred to physical wounding only. Its “psychic wound” sense did not pick up until the tail end of the 19th century, and that sense is now far more familiar to us than the original. Exactly what took root in the world between then and now? The standard narrative is that the medical profession became wiser, but what of the wisdom embedded in our species’ genetic history? Note that even most doctors and biomedical lab technicians know little of basic genetics, or, one has to assume, of evolutionary reasoning. I recall being sneeringly told by one, on introducing her to the concept, that she was only interested in “proper science.” This is about when it set in that even many “grunt-work” scientists are basically morons. She certainly was.

Applying the principles of natural selection (i.e. evolutionary reasoning) to find the aetiology of disease tends to yield different answers from those that are now fashionable. In a 2000 paper, “Infectious Causation of Disease: An Evolutionary Perspective,” the authors compellingly argue that a huge number of supposedly mysterious illnesses are in fact caused by pathogens – bacteria or viruses. The argument is simple: any genetic endowment which essentially zeroes fitness (reproductive potential) can be maintained in a population’s genes only at the basal rate of errors, i.e. mutations, in the genetic code – the one apparent exception being heterozygote advantage, as with the protection against malaria it affords carriers. Thus, anything so destructive which rises above a certain threshold of prevalence should arouse suspicion that a pathogen is to blame. This would include schizophrenia, an alleged evolutionary “paradox,” with a prevalence of ~0.5%, especially since, unlike “psychopathy,” schizophrenia has low twin-concordance, low heritability, and is discontinuous with normal personality. At present, direct evidence of the pathogen is scant, but that is to be expected: viruses are tricksy. No other explanation is plausible.
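The mutation–selection arithmetic behind this threshold argument can be sketched in a few lines. This is the standard single-locus approximation, not anything taken from the paper itself; the mutation rate, the selection coefficient, and the function name are illustrative assumptions of mine.

```python
# Mutation–selection balance: the argument above in numbers.
# Single-locus approximation; mu and s values are illustrative assumptions.

def equilibrium_prevalence(mu, s, dominant=True):
    """Approximate equilibrium frequency of affected individuals when
    selection removes the allele and mutation replenishes it."""
    if dominant:
        return 2 * mu / s          # affected heterozygotes; q ~ mu/s
    q = (mu / s) ** 0.5            # recessive: equilibrium allele frequency
    return q * q                   # affected homozygotes = mu / s

mu = 1e-5  # a generous per-locus mutation rate
s = 1.0    # fitness effectively zeroed

print(equilibrium_prevalence(mu, s))         # ~2e-5
print(equilibrium_prevalence(mu, s, False))  # ~1e-5
# Either way, orders of magnitude below a ~0.5 % prevalence – which is
# exactly the gap the infectious-causation argument sets out to explain.
```

Under any choice of plausible parameters, the equilibrium stays far below observed schizophrenia prevalence, which is the whole force of the “arouse suspicion” heuristic.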

What, then, when one turns the evolutionary lens towards “trauma”? What is commonly called psychological trauma can helpfully be divided into two categories: non-chronic and chronic. The former is what most people would call distress. It is adaptive to have unpleasant memories of situations that could kill you or otherwise incur significant reproductive costs, which is why everyone feels it: it is good to remember, vividly, what happened the last time you put your hand on an electric fence. It is bad, and certainly not evolutionarily adaptive, for the memory to continually torture you for years after the fact. I have it on good authority that this does nothing to attract mates, for example.

In light of this, it becomes clearer what may be behind the apparent explosion of mental “traumas” in our psychiatry-obsessed world. One may observe, for instance, that there is no record of anything remotely resembling PTSD in the premodern world. It emerged in the 20th century, either as a result of new weapons inflicting new kinds of damage (brain injuries), or from psychiatrists’ egging people on, or both. If the received narrative about it were true, then all of Cambodia ought to have gone completely insane in recent times. It did not happen. Likewise with rape. One struggles to find any mention of long-term trauma from rape for most of human history. The ancients were not very chatty about it. Of course, they saw it as wrong, as is rather easy to do, but their notions about it were not ours. Rape does impose reproductive costs, but so does cuckoldry, and being cuckolded does not cause chronic trauma. Nor would claiming that it had done so to you do much for your social status. Sadly, exactly one person in existence has the balls to comment on this rationally. Many of these problems seem to originate from something more diffuse, something about the cultural zeitgeist of our age, rather than a particular field or bureaucracy.

It is generally agreed upon in the modern West that sexual activity before late adolescence, especially with older individuals, is liable to cause trauma of the chronic kind. This alone should give one pause, since “adolescence” is a linguistic abstraction with only very recent historical precedent, and many of the biopsychological processes which are conventionally attributed uniquely to it begin earlier and persist long after. The onset of stable, memorable sexual desire and ideation occurs at the age of ~10 (it was certainly present in me by age 11), coincident with gonadarche, and is almost universally present by age 12-13. The reason these desires arise with gonadarche is simple: they exist to facilitate reproduction. It would make little biological sense in any species other than humans to experience sexual desire but also experience some strange latency period of 1-8 years (depending on the country) during which any acting upon those desires causes inconsolable soul-destruction. Any time something seems completely unique to humans, one has to wonder if it has something to do with uniquely human cultural phenomena such as taboos. It is even more obvious when one observes human cultures which lacked these taboos, e.g. Classical Greece. When they married their daughters off at age 13-14, they were concerned chiefly with whether the groom could provide her and her children with a stable living. But they were not concerned about soul-destruction. At least, I’m fairly sure of that. For the record: this is not an endorsement of lowering the age of consent. I am decidedly neutral on that question, but I do not believe Mexico’s answer is any less correct than California’s or vice versa.

It is wrong to say that psychiatrists, or therapists, have a superpower of changing people’s phenotypes. This is impossible, as any such change they could impart would be genetically confounded, i.e. it is a genetically non-random sample of the population who are “successful” subjects of their interventions. So it seems fair to assume that a lot of mental health problems are explicable in this way rather than through straight-up iatrogenesis, and that their prevalence is inflated somewhat through media hype and social media shenanigans. However, an interesting question is: how much of an evolutionarily novel phenomenon is the field of psychiatry? Are our minds equipped to deal with it? Well, not everyone’s. It seems possible to confect illnesses out of thin air if you subject the right person to the right conditioning, as is the case with the probably purely iatrogenic “dissociative identity disorder.”

Masses of people these days shell out large chunks of their finances on “therapy,” a form of psychiatric intervention which has shown itself to be of at best mixed efficacy. Many long-running randomised controlled trials of its effects turn up jack shit, which ought not to be shocking given what is known about the non-effects of education, extensively documented by Bryan Caplan and others: for therapy to work as advertised, it would have to change the brain in some dramatic way, and there is little sign that it does. Still lingering, though, is the question of whether it may in fact make matters worse. Many social commentators have taken notice of the way in which mental illness, especially “depression,” seems to be afforded a kind of bizarre social status in some circles, such as within university culture in Canada. Even more galling is that it is not even clear whether “depression” of the garden variety is a disorder; it may be an adaptation that evolved to ward people off hopeless pursuits. Status is a powerful motivator, so this weird grievance culture cannot help, but encouraging people to make their living from talking to such people and consoling them with soothing words cannot be great either, since it is likely to induce the kind of institutional inertia on which the pointless continuance of America’s “drug war” is sometimes (correctly) blamed.

Legalising drugs and investing more energies into high-precision “super-drugs,” e.g. powerful mood-enrichers with no side effects, would do more for the true chronic depressives who literally have never even known what it means to be happy – a malady probably induced by rare mutations if it exists – than what is on offer today. Drugs are the only guaranteed way to do profound psychological re-engineering without gene-editing. It is not clear, though, if the psychiatric industry as it currently exists would be happy to see such problems vanish.

Diseases, Disorders and Illnesses, Oh My

This will serve as an addendum of sorts to my article, The Harmless Psychopaths.

Medicine as a science is a modern phenomenon. It was not all that long ago that going to a doctor was more likely to hurt than help, a fact which persisted, some think, until as late as the 1930s. Medical researchers tend to just keep plugging away at their specialist interest and are unconcerned with what to them seem like instrumentally useless philosophical minutiae. Moral philosophers might argue about meta-ethics – the nature of moral statements – but this does not seem a necessary prerequisite to a relatively harmonious social order. One might just as well ask what the use of “meta-medicine” is – wondering at the underlying assumptions of medical diagnoses – when scientists are quite happy getting on with finding cures for cancer, and whatever else.

Unfortunately, medicine is as subject to such human frailties as status-seeking and fashion as anything else. It became unfashionable in the 20th century to look for pathogenic causation of diseases, thanks to the then-nascent science of genetics, which is why it was not accepted as common knowledge that bacteria cause peptic ulcers until the 1980s, despite this having been suspected, on good evidence, for well over a hundred years. Note that most cancers are only dimly heritable, in contrast with, say, autism, and have no clear Mendelian inheritance pattern. (Fill in the blank.)

Medicine is an applied science and so obviously has a prescriptive dimension to it, i.e. what is worth treating? Call this the meta-medical question if one likes. The answer is not so complicated when dealing with physical disorders which glaringly go against the sufferer’s interests and those of peers, such as the flu, atherosclerosis, whatever. But what about disorders of the mind? Surely a meaningful concept, but one far more prone to spurious theorising and fashion-biases in answering the meta-medical question, due to the diversity of moral viewpoints about what counts as “disordered” behaviour. For the purposes of this post, I use the terms disease, disorder, and illness interchangeably – which they more or less are in everyday usage.

This is how the DSM-IV defines mental disorder:

A. a clinically significant behavioral or psychological syndrome or pattern that occurs in an individual

B. is associated with present distress (e.g., a painful symptom) or disability (i.e., impairment in one or more important areas of functioning) or with a significantly increased risk of suffering death, pain, disability, or an important loss of freedom

C. must not be merely an expectable and culturally sanctioned response to a particular event, for example, the death of a loved one

D. a manifestation of a behavioral, psychological, or biological dysfunction in the individual

E. neither deviant behavior (e.g., political, religious, or sexual) nor conflicts that are primarily between the individual and society are mental disorders unless the deviance or conflict is a symptom of a dysfunction in the individual

The inadequacies of this are manifold and torturously obvious. Childbirth seems to fit quite snugly with condition B. Also, it is generally unhelpful to define a word in terms of itself or its synonyms, as D. does with “dysfunction.” E. seems to take it as read that the distinction between biological dysfunction and normal deviance is obvious, yet it is apparently not to most psychiatrists. It is for that reason that the traits branded “psychopathy,” for example, are continuously distributed in the population and usually harmless, but there exists an arbitrarily defined cut-off at the right tail of the distribution where it is conveniently labelled “disorder,” and the relevant convenience is just relative to the interests of whoever finds these traits unappealing or lacks the theory of mind to understand them. See also: ADHD, and teachers.

How to get around this arbitrariness? If ADHD and psychopathy are not useful to us WEIRDos, who or what are they useful to? Well, they are adaptations: they have a fitness benefit, i.e. a reproductive edge, in at least some environments, even if they are unpalatable to individual persons. This evolutionary view is what tempts some to propose a purely Darwinian definition of disease in which disease is conceived as any embodied phenomenon that is counter-adaptive across all environments. This would make homosexuality a disease, but “Asperger’s syndrome,” “ADHD,” and “psychopathy” not. This could certainly be illuminating from a solely descriptive angle where the only interest is to scientifically describe the causes of disease, but it is useless to practitioners of medicine and psychiatry, for whom the relevant question is “What ought to be treated?” If one asks doctors what the problem is with flu, they are unlikely to say anything about how it affects one’s reproductive chances, and, well, it doesn’t. Not much.

However distasteful one may find it to mention as a dispassionate intellectual, it is also a fact that the word “diseased” in popular usage carries a certain moral valence, even when applied to activities that one does not think morally important. To say that “Behaviour X is a disease” is not simply to say that it is evolutionarily maladaptive, but that it is wrong. This would seem an unhelpful confusion.

For the application of medicine, I tentatively suggest that what I think is the best formulation of the conventional usage of “disorder” be merged with the Darwinian definition: anything, internally generated (which may be another bone of contention, but that is a separate topic), which leads to non-trivial suffering in the individual and also has no conceivable fitness benefit. As for the descriptive-only theorists and researchers, the Darwinian definition is fine on its own, although perhaps it is worthwhile to find a word other than “disease.”

Unrealistic Adaptations

Of all the mental shortcuts embedded in human languages which impede understanding of mindless processes (such as natural selection), few are more obnoxious than “because.” From this comes a tendency to anthropomorphise, and to read all outcomes in nature as if they were ordained by something approximating an “intention.” “Religion has to be an adaptation, because the religious (currently) outbreed the irreligious.” The second clause in that sentence is (currently) correct, but the “because” makes it sound as though the current religious selection advantage represents some “design feature” with the desired (by what?) end of promoting reproductive fitness (adaptation), as if that settled the matter. Contrast with the following sentence:

“Under current conditions in which the religious outbreed the irreligious, religion is adaptive.” This statement is of course tautologous, since to say that a trait or behaviour is adaptive means merely that under condition X it gives one a reproductive edge. The term “adaptation,” though, is often applied to traits or behaviours which are selectively neutral or even counter-adaptive in particular environments. Genes which contribute to an overzealous appetite may be fitness-neutral to a subsistence farmer but become obesogenic in the modern world of easily available food. The genes’ carrier still exercises this “adaptation,” but it is no longer adaptive, reproductively useful, except in an environment full of fat-fetishists.

Human society has changed so dramatically in the last two centuries that it would be hasty, to say the least, to assume that everything with a current selective disadvantage is an “illness” (due to pathogens, mutational load, or whatever). Equally, one cannot assume that something with a current advantage evolved because it resolved an adaptive problem. Religion was ubiquitous across cultures before the 20th century, yet now the religious represent an ever tinier percentage of the population in many countries, and it remains to be seen just how tiny the “genetic hard core of religiosity” will get before the trend is reversed. If the presence of religion were explicable in terms of fitness benefit, why are the genes not already more widespread? This alone should be enough to tell you that genes (and thus, adaptation) per se had little to do with religion’s evolution.

But apparently this is not obvious to some. Many people are inclined to view adaptations as intricate mechanisms, which by dint of their intricacy are delicate and susceptible to dysfunction, rather like the springs and levers of a pocket-watch. All analogies are imperfect, but this is a useless one. Some traits, and indeed behaviours, are more prone to being changed by exogenous insults than others. For instance, a particularly naive person might imagine that in a pandemic of severe endometriosis, whereby female beauty and youth cease to be predictive as indicators of fertility, males would lose their sexual attraction to these traits, since the attraction would no longer perform its original “function.” Needless to say, this would not happen. Male callogamy (“attention to beauty”) has proven so reliably fitness-enhancing over the eons, since even before the human species, that it is extraordinarily resilient to any incentive change: selection will always favour a deterministic developmental pathway for such consistently valuable traits. General intelligence is yet another example: the current dysgenic trend is a product of the last few generations, on the order of ~1 point per generation despite the global ramping up of mutational load (we’ll see how long it can last), and almost no non-genetic factors seem capable of depressing its expression to any appreciable degree. Lead looked plausible at some point, but then you remember that Victorians liked to use mercury in their make-up, and yet the 19th century was the most intellectually productive in human history.

Viewed under this light, the “religion as adaptation” thesis looks all the more dubious. Evolutionary forces – selection, mutation, drift, etc. – are just as capable of acting on general intelligence and other psychological traits as anything else, hence the well-documented evolved changes in the European peoples since around AD 1000: declines in violence, and probably gains in intelligence, culminating ultimately in the zenith of the 19th century. Evolution can indeed happen fast, but not that fast. The bulk of these changes took place over a period of, at minimum, 20 generations, not 2-3, and our intelligence has more or less survived the last 2-3 generations intact. Religion has not. It has none of the hallmarks of an adaptation, but all the hallmarks of a complex socially learned behaviour, maintained by powerful norm-enforcers and epistemic authorities, which has lost currency in recent decades for a variety of reasons, the most commonsense explanation being that it no longer appeals to the educated because the answers it gives are inferior to those of other epistemic authorities, i.e. scientists.
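The “fast, but not that fast” intuition can be put in rough numbers with the breeder’s equation, R = h²S, the standard formula for the per-generation response of a trait to selection. The heritability and selection differential below are illustrative assumptions of mine, not estimates for any real trait.

```python
# Breeder's equation: per-generation response R = h2 * S.
# h2 and S are illustrative assumptions, not empirical estimates.

def cumulative_response(h2, S, generations):
    """Total change in the trait mean after the given number of
    generations, assuming constant heritability and selection."""
    return h2 * S * generations

h2 = 0.4   # assumed narrow-sense heritability
S = 1.0    # assumed selection differential, in trait units per generation

print(cumulative_response(h2, S, 20))  # a substantial shift over ~20 generations
print(cumulative_response(h2, S, 3))   # comparatively little over 2-3
```

Even with generous parameters, two or three generations buy only a small fraction of what twenty can, which is the asymmetry the paragraph above leans on.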

The human capacity for cultural transmission through language makes a nonsense of the notion that anything which is not adaptive, even across all environments, should be impossible to sustain. The most obvious example in Christian cultures is the vow of celibacy, and there are numerous others such as taboos against eating highly nutrient-dense foods, which persist among the undernourished tribes of Papua New Guinea. So too with the European wars of religion, which resulted in millions of young men dying childless in their haste to protect a non-existent natural resource, i.e. God’s favour.

Group selection is another temptation when formulating theories about the origin of religion – the idea that even a behaviour which reduces fitness at the individual level can persist if it provides some advantage at the level of the social group. It is a neat idea, but clearly unworkable in practice. Suppose some cohort within one’s country likes spreading the word of God through warfare – call this behaviour X. They can seize new territory in God’s name and provide new land for others in their group who are not quite so zealous, and this may look like a “success” to the people who reap those rewards, but at the end of the day, the behaviour is still going to diminish, because everyone who engages in it is at a massively elevated risk of dying before reproduction. Evolution does not care about states or dominions.
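The logic of that argument can be made explicit with a toy haploid model, on assumptions of my own choosing: zealotry carries a fixed fitness cost to its bearers, while its spoils are shared by the whole group. Because the shared benefit accrues to zealots and non-zealots alike, it cancels out of relative fitness, and the trait declines no matter how large the group-level payoff.

```python
# Toy model: behaviour X benefits the whole group but costs its bearers.
# cost and group_bonus are illustrative assumptions.

def next_freq(p, cost=0.2, group_bonus=0.1):
    """One generation of selection on the zealot trait at frequency p."""
    shared = 1 + group_bonus * p       # everyone enjoys the conquered land...
    w_zealot = shared * (1 - cost)     # ...but only zealots pay the death toll
    w_other = shared
    mean_w = p * w_zealot + (1 - p) * w_other
    return p * w_zealot / mean_w

p = 0.5
for _ in range(50):
    p = next_freq(p)
print(p)  # after 50 generations the trait has all but vanished
```

Raising `group_bonus` changes nothing, since it multiplies both fitnesses equally; only a benefit that zealots captured disproportionately for themselves could rescue the trait, at which point it is no longer group selection.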

It is understandable why post hoc stories about religion as adaptation are popular, even among well-informed people. Intelligence is not a good predictor of having sensible views where political matters are concerned, since politics is about group loyalty more than anything else. This is why the number of US Democrats who thought immigration was an important social issue declined precipitously in the 2010s when it became the issue “of” the right; what mattered was showing solidarity against rival political coalitions (i.e. the right) rather than the truth. Adaptive stories about religion seem to appeal an awful lot to European traditionalist-nationalists who are hoping to use Christianity as the conduit for some kind of renewed ethnocentrism to uplift the European spirit. The Chinese do not seem to need it, oddly enough. Nor even the Czechs, much closer to home. It did not work for Rome, and it sure as fuck won’t for us.