On Agency and Accountability

Is it morally permissible for a 14-year-old to be enlisted in the military under any circumstance? The impulse this question elicits is one of disgust, founded on a historically and geographically local set of assumptions about moral agency and its relationship to age. The “true” existence of agency itself is contestable if taken to mean the free exercise of an internal will. In light of a determinist view of causality, it may be merely a legal heuristic – a means of differentiating the degrees of illusory freedom to act in developed and developing brains. The human brain does not mature until the mid-to-late 20s, yet one can safely assume that no one favours withholding all freedoms until age 25, since no such standard exists anywhere. With brain maturation off the table as a viable standard, we are left with varying degrees of randomness, stupidity, and (often hollow) virtue-signalling.

It goes without saying that reason postcedes rather than precedes morals – humans have moral emotions, which they justify through consequentialist reasoning only if such an expectation is placed upon them. Hence, discourse of the kind that follows here is rare.

One may ask, “What is true of a 14-year-old which, if true of an 18-year-old, would render the 18-year-old unfit for military service?” Intelligence is an untenable answer, because intelligence does not scale linearly with age, and there are (and have been) plenty of legally adult military personnel with IQs in the 80s. The problem would be resolvable, perhaps, if age were a quantifiable trait, as intelligence is, rather than a numerical series of demographic cohorts, each containing large individual variation in mental profiles. The reason it is illegal for the US military to induct anyone with an IQ below 83 is not even “moral,” by the way: such people are simply useless to the military. The moral argument, if any, is retroactive. No one seems to care how hard life is for them, really. Perhaps the “child” is a sacred demographic category, one which, under current conditions, must be extolled as a nexus of antediluvian bliss and innocence. By contrast, even acknowledging the unintelligent as a group with distinct needs is met with indifference at best and, more often, with suspicion.

It is legal in the UK to join the armed forces at age 16. Few who object to this seem interested in learning how many of these 16-year-olds regret the decision in hindsight, which would seem a good test of how “impulsively” the choice is made. It is well known that the military offers a kind of “escape hatch” of civic duty for teenagers with little hope of succeeding elsewhere, and many of them surely wanted to join from younger ages. The military, then, removes them from the morass of indecision and wastefulness that they would otherwise carry with them. “Impulsivity” is hard to measure. Maybe it can be defined in qualitative terms: a tendency to make decisions with low future-orientation. But if 14-year-olds are blighted by that, many more adults are, especially among the stupid.

Note, concomitantly, that the moral importance of emotions such as “regret” is not uniform. I argue that there are cases in which it can safely be called delusional or self-inflicted, as all emotions can be. In some cases, it simply does not feel significant enough to onlookers for it to be considered morally salient, such as the regrets of child actors about their profession as they become adults – something that few care to acknowledge.

Even the knee-jerk harm-reduction case against the military has complications in the context of this argument, because in most years military deaths are low – the fatality rate is probably lower than in construction work or industrial fishing. Yet no one cares for the lowly construction worker, despite the fact that he has a 50% chance of having an IQ south of 92 and, relatively speaking, an alarming risk of dying in any given year, not just in those years when there is an ongoing war. Sudden, high-density death is more morally weighty to the average person than slow, diffuse, “accidental” death.

The hard determinism of brain-states, combined with knowledge of those states’ evolution through age, may have relevance to legal degrees of agentic “freedom.” If one compares the brain at different stages of development, the complexity and variety of its interactions with the world (“decisions”) at one stage may differ, on aggregate, from those at another, although exactly how morally salient the difference is will vary from case to case and may sometimes be doomed to remain a subjective judgement call.

The flip side of agency (or freedom) is accountability, a concept which was once as ubiquitous as that of “freedom” is today. The extent of brutal judicial punishment in premodern England was remarked upon by contemporaneous authors and looks absurd in retrospect. I suspect a large part of the explanation was an environment of resource scarcity and technological deprivation, and the social tinderbox to which these conditions gave rise: lawmakers may have felt that they could not but disincentivise antisocial behaviour in the clearest way possible, because whatever antisocials destroyed could not be rebuilt as quickly as it can be today. They took no chances on recidivism. Another example would be the harsh punishments for sex crimes, in a world swimming with sexually transmitted pathogens but lacking effective medicine. This thesis could be empirically examined: are harsh punishments more common in deprived regions? Is clemency more common among the privileged classes? And so on. It also seems to dovetail with the assertion that our modern obsession with rights and freedoms is a product of technologically generated luxury, without which the social order prioritises duties.

Old criminal justice may have been disproportionate and cruel. Nonetheless, if it were ever possible to quantify how well behaviour X at a given age predicts outcome Y at a later date, why deny agency, or accountability, in cases where behaviour is indeed predictive? God knows who is qualified to make such calculations; probably no human, since humans are all preoccupied with sending virtue signals of endless freedom and protection. But, in principle, it ought not to be difficult to tell whether a child who murders is likely to do so again in adulthood, just as it is possible to make an algorithm that predicts recidivism within adulthood. In which case, why not hang the little shit? (Comedic exaggeration, of course. I do not endorse the death penalty.) After all, many traits are stable throughout the lifespan. What is now called psychopathy is usually one of them. I have no “solution” to the problem of age-dependent social norms, nor much hope of a “science” of agency and accountability ever coming to pass. The relation between age and agency is nevertheless a conundrum of interest to thinking people.
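Actuarial recidivism prediction of the kind alluded to here is, mechanically, nothing exotic. A minimal sketch in Python, with every weight and threshold invented for illustration (no resemblance to any validated instrument is claimed):

```python
# A toy actuarial risk score, loosely in the spirit of instruments used for
# adult recidivism prediction. All weights and cut-offs below are invented
# for illustration, not taken from any validated tool.

def recidivism_score(age_at_first_offence, prior_convictions, violent_history):
    """Sum weighted risk factors into a crude score; higher means riskier."""
    score = 0
    if age_at_first_offence < 16:
        score += 2                       # early onset: a classically stable predictor
    score += min(prior_convictions, 5)   # cap priors so they cannot dominate
    if violent_history:
        score += 2
    return score

def risk_band(score):
    """Map a raw score onto coarse bands, as actuarial tools typically do."""
    if score <= 2:
        return "low"
    if score <= 5:
        return "moderate"
    return "high"
```

Real instruments differ mainly in that their weights are fitted to follow-up data rather than guessed; the structure – a handful of stable predictors summed into a banded score – is the same.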

It is doubly unlikely that such a science will emerge while social norms from high-status societies, such as the West, spread across the planet memetically. Eventually it will get to the point where people cannot distinguish the signal from reality, and we shall all pretend that the move in this direction was “scientific” and “progressive,” as with child labour laws. A less pernicious example of exactly this dynamic is the recent trend of Arabs turning away from religion. By far the biggest predictor of (ir)religiosity at the national level is IQ, and this change is not due to the Arabs’ having gained IQ points, so they are probably just copying what they see as their social betters in the West.

The arbitrariness of it all came to mind again when I saw someone on Twitter being called a “true sociopath” and an “enabler of child-rape” because he had apparently endorsed changing the age of sexual consent to 15. The same question was debated in Britain recently. Since the legal age is 16 here, presumably Britons are enablers of child-rape as far as the average American is concerned. It never occurs to them that the concept of the post-pubescent “child” may be highly socially fungible, and that no one can really agree on what it is, let alone what its rights or liabilities should be.

Such faux pas are ever-present, though, because extraordinarily few people have a picture of reality in their heads that integrates anything beyond the whimsy of the here and now. Many people struggle to wrap their heads around how different public opinion was on their political hobby-horse as recently as 20 years ago, never mind any further back.

And that’s not counting the surprising number of stupid and ahistorical things which even political dissidents believe.

PMS City

As with labels such as “schizophrenia,” and many besides, premenstrual syndrome is a symptomatological diagnosis – a category formed not on the basis of any known cause but on a loosely associated set of symptoms, which are many and vary in severity.

And none of them should exist. Evolution has had a near-eternity to chip away at the flaws of the reproductive system, and normal bodily processes that induce pain or debility ought to be selected against unless there is some obvious adaptive trade-off (the only example that comes to mind is giving birth). Furthermore, since these problems are experienced by only a subset of women, they are not an inevitable result of hormonal changes.

An alternative explanation is pathogens. During the luteal phase of the menstrual cycle, the immune system is weakened to avoid destroying new embryos, leaving women vulnerable to infectious agents. Empirically confirmed pathogen associations with PMS symptoms include chlamydia and Trichomonas vaginalis, but there could easily be others which have either evaded precise investigation or been ignored.

The psychic pain brought on by menstruation is well documented. Hippocrates spoke of it, but he was clearly talking about the “madness” that could come as an effect of physical symptoms such as dysmenorrhoea, not an independent mania or irrationalism brought on by what we now call “being hormonal,” whatever that means.

About one-quarter of women report clinical symptoms of PMS, which are likely pathogenic in origin, but a fairly decent percentage of them, with or without the disease symptoms, report other problems such as killing people, screaming Love Island-tier insults at household objects, crying incontinently, losing the ability to turn-take in conversation, psychotic paranoia, and wasting other people’s money.

Lots of physiological processes happen all the time which, theoretically, could have a noticeable impact on mood. Levels of cortisol, the “stress hormone,” shift throughout the day, peaking in the hours just after waking, and drinking alcohol has a far more dramatic impact on nearly all aspects of brain function than anything menstruation does. Yet the Morning Cortisol Rage has yet to enter the popular lexicon, and the effects of alcohol are closer to being psychosomatic than is ordinarily assumed: it is not a human universal that drinking causes chimpanzee-like states of aggression and disinhibition, as it does in Britain. So it looks like another anomaly of our time and place, a thing that exists because people want it to. The same probably holds true for – well, lots of things.

Childhood was a fun time. Maybe it’s not surprising that people love an excuse to return to it; some periodically, others pretty much all the time.

Pinksheet Yang

A couple of months ago, when an avalanche of Yang memes seemed to appear out of nowhere, Hunter Wallace pointed out (his YouTube channel has been deleted, so I can’t link to it) that this wasn’t organic, and that Yang was clearly getting a “boost” from somewhere. Wallace was certainly correct about that. It was clearly a coordinated, professional op, but by whom? I have some ideas about who was directing it and why, but it doesn’t matter; it’s all speculation. It’s also hard to tell to what extent anything that originates from places like 4chan is even real anymore, or to what extent it ever was. Many of Yang’s policies were good – if nothing else, $1000 a month is $1000 a month – but his candidacy was propelled by what was essentially a “pump and dump” scheme, similar to those used in the seedy world of pinksheets and penny-stock promotion. With that thought, how appropriate the “pink hats” were.

None of that was Yang’s fault though. He of course made a strategic error in failing to embrace his new “supporters” and capitalize on the momentum gifted to him by the powers that be. Many people were disappointed by this and quickly abandoned the yacht. Part of me found it kind of admirable that Yang insisted on being true to himself, “math” and all, rather than latching onto some fleeting, trendy meme campaign and pretending to be an obnoxious shitlord.

Yang did make some real blunders though. His first error was the decision to announce a new policy every day (I can’t remember whether it was for 30 or 60 days). Many of these proposals seemed to have been pulled out of his ass or to be the product of poor advice. Things such as “lowering the voting age to 16” were totally unnecessary and alienated a lot of potential supporters. He failed to take his own advice and “focus on the money.” His big selling point was the $1000 per month. That is all he should have been talking about, with the exception of a few other common-sense stances on important issues of the day to show he was a serious, well-rounded candidate. Yang’s other serious error was his over-the-top pandering to SJWs and Russia-conspiracy airheads. There is no way that someone as smart as Yang really buys into all that nonsense. The same criticism I applied to Trump years ago applies to Yang. Intelligent candidates are at their best when they boldly articulate what they believe in their hearts rather than telling people what they think (or have been advised) voters want to hear. Even if it seems unpopular or like a bad move politically, you have to just take the heat and press forward, confident that you will be vindicated. Lead the people where you want them to go.

Finally, I didn’t watch the debates, but by every indication, Yang’s performance was a disaster. He squandered what little airtime he received on statements like “Russia is hacking our democracy.” Yang clearly does not understand where his potential pool of support lies. There was a niche available to him which he has been too clueless to recognize and exploit. Look, I like Yang. I wrote three lengthy essays and made a YouTube video expressing enthusiastic (by my standards, anyway) support for him. There’s still a long way to go in the election. If he’s really good at math, maybe he can learn from his mistakes like a sophisticated computer. At this point, though, I don’t believe Yang has what it takes. $YANG stock has tanked. Don’t be left holding this bag.

A Brief Look at “Incel” Hysteria

Individuals of the libertine persuasion – those who take delight in the kind of bland sex-positive advocacy for which there are now countless figureheads on the internet – have suggested to me many times that the availability of internet pornography is behind the fall in sexual violence that began roughly in the early 90s. I did not think that was abjectly insane, but it was certainly hard to miss some inconvenient background details: homicide, and many other kinds of crime, also fell in the 90s, which casts immediate doubt on pornography as the cause. No one quite knows why the decline happened, but there are several hypotheses, some spicier than others.

It is important to mention that this period of declining violence (~1990-2005) is a mere blip within a trend which has been going on far longer. Technology, broadly construed, may be responsible for some of the recent decline; before that, credit goes to the aggressive genetic selection against criminality that took place in Europe over several centuries.

One wonders: when was the last time that rape was a viable reproductive strategy – that is, more likely to result in descendants than in imprisonment and ignominy? World War Two provides a recent example: the Red Army left hundreds of thousands of descendants through the rape of women of the nations it was at war with, but that is quite different in its social consequences – i.e. the selective force acting on it is much weaker – from raping women in one’s own social in-group or clan. All this is amplified, too, by the availability of abortion, which renders births from rape in the First World basically non-existent at this point.

So the type of rapist who rapes the enemy’s women in war has a distinct psychological profile from, and is probably far more common than, the type who rapes at home. In the modern West, the latter type makes up almost all “rape data” after controlling for immigration. Exactly what do we know about those people?

Genes do not seem to contribute much to the variance in propensity to rape adults in Sweden. Then again, a possible confound: Sweden. Still, I would not be surprised if the pattern held even in undiversified regions. For instance, it could be that the heritability of within-group rape has declined over time because its “adaptive” function, if it ever had one, is now a dead end. Within-group rape would then survive because the genes that increase the likelihood of committing it are normally implicated in other, more reproductively successful behaviours.

All of this should cast doubt on the idea that the incel “phenomenon” will trigger a rape epidemic. Men who rape (again, excluding wartime rape) have different brains from men who do not, regardless of whether they are celibate. And anyone saying that men need more “sexual outlets” to ward off the incoming Incel Rape Army is full of shit. There has essentially never been a society where long-term relationships (or even short-term liaisons) are men’s only chance at sexual access, even where alternative means are “banned.” Pornography and prostitution are banned in South Korea, yet approximately 23% of men there have visited prostitutes.

The average marital age for males in Western Europe has been 26-28 for centuries, and a goodly portion of the population never married, even back when marriage was far more of an idealised social norm. Current trends are really not all that terrifying.

A World of Trauma – Civilizational Psychosadomasochism and Emptiness

According to Google’s vast textual corpora, there was nary an instance of the term “trauma,” or its distinctly psychiatric derivative “traumatized,” in written English prior to the 1880s. The first usage of “trauma” is documented in the 1690s, at which point it referred to physical wounding only. Its “psychic wound” sense, now far more familiar to us than the original, did not pick up until the tail end of the 19th century. Exactly what took root in the world between then and now? The standard narrative is that the medical profession became wiser, but what of the wisdom embedded in our species’ genetic history? Note that even most doctors and biomedical lab technicians know little of basic genetics or, one has to assume, of evolutionary reasoning. I recall being sneeringly told by one, on introducing her to the concept, that she was only interested in “proper science.” This is about when it set in that even many “grunt-work” scientists are basically morons. She certainly was.

Applying the principles of natural selection (i.e. evolutionary reasoning) to find the aetiology of disease tends to yield different answers from those that are now fashionable. In a 2000 paper, “Infectious Causation of Disease: An Evolutionary Perspective,” the authors compellingly argue that a huge number of supposedly mysterious illnesses are in fact caused by pathogens – bacteria or viruses. The argument is simple: any genetic endowment which essentially zeroes fitness (reproductive potential) can be maintained in a population’s gene pool only at the basal rate of errors (mutations) in the genetic code – the apparently sole exception being the heterozygote advantage that protects against malaria. Thus, anything so destructive which rises above a certain threshold of prevalence should arouse suspicion that a pathogen is to blame. This would include schizophrenia, an alleged evolutionary “paradox” with a prevalence of ~0.5%, especially since, unlike “psychopathy,” schizophrenia has low twin-concordance, low heritability, and is discontinuous with normal personality. At present, direct evidence of the pathogen is scant, but that is to be expected: viruses are tricksy. No other explanation is plausible.
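The mutation-selection logic here can be made explicit. As a sketch (the figures below are textbook orders of magnitude, not numbers taken from the paper), the equilibrium frequency of a deleterious allele with per-locus mutation rate μ and selection coefficient s is:

```latex
% Mutation-selection balance for a deleterious allele:
\hat{q} \approx \frac{\mu}{hs} \quad \text{(partially dominant, dominance } h\text{)},
\qquad
\hat{q} \approx \sqrt{\frac{\mu}{s}} \quad \text{(fully recessive)}.
```

With μ on the order of 10⁻⁵ and the near-total fitness cost the authors posit (s close to 1), the dominant case supports frequencies around 10⁻⁵, orders of magnitude below schizophrenia’s ~0.5% prevalence; only a fully recessive single-locus architecture gets within range, an assumption at odds with the condition’s polygenic genetics. Hence the suspicion that something environmental, i.e. a pathogen, is doing the work.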

What, then, when one turns the evolutionary lens towards “trauma”? What is commonly called psychological trauma can helpfully be divided into two categories: non-chronic and chronic. The former is what most people would call distress. It is adaptive to retain unpleasant memories of situations that could kill you or otherwise incur significant reproductive costs – putting one’s hand on an electric fence, say – which is why everyone feels this. It is bad, and certainly not evolutionarily adaptive, for the memory to continually torture you for years after the fact. I have it on good authority that this does nothing to attract mates, for example.

In light of this, it becomes clearer what may be behind the apparent explosion of mental “traumas” in our psychiatry-obsessed world. One may observe, for instance, that there is no record of anything remotely resembling PTSD in the premodern world. It emerged in the 20th century, either as a result of new weapons inflicting new kinds of damage (brain injuries), or from psychiatrists’ egging people on, or both. If the received narrative about it were true, then all of Cambodia ought to have gone completely insane in recent times. It did not happen. Likewise with rape. One struggles to find any mention of long-term trauma from rape for most of human history. The ancients were not very chatty about it. Of course, they saw it as wrong, as is rather easy to do, but their notions about it were not ours. Rape does impose reproductive costs, but so does cuckoldry, and being cuckolded does not cause chronic trauma. Nor would claiming that it had done so to you do much for your social status. Sadly, exactly one person in existence has the balls to comment on this rationally. Many of these problems seem to originate from something more diffuse, something about the cultural zeitgeist of our age, rather than a particular field or bureaucracy.

It is generally agreed upon in the modern West that sexual activity before late adolescence, especially with older individuals, is liable to cause trauma of the chronic kind. This alone should give one pause, since “adolescence” is a linguistic abstraction with only very recent historical precedent, and many of the biopsychological processes conventionally attributed uniquely to it begin earlier and persist long after. The onset of stable, memorable sexual desire and ideation occurs at the age of ~10 (it was certainly present in me by age 11), coincident with gonadarche, and it is almost universally present by age 12-13. The reason these desires arise with gonadarche is simple: they exist to facilitate reproduction. It would make little biological sense, in any species other than humans, to experience sexual desire and yet also a strange latency period of 1-8 years (depending on the country) during which any acting upon those desires causes inconsolable soul-destruction. Any time something seems completely unique to humans, one has to wonder whether it has something to do with uniquely human cultural phenomena such as taboos. It is even more obvious when one observes human cultures which lack these taboos, e.g. Classical Greece. When the Greeks married their daughters off at age 13-14, they were concerned chiefly with whether the groom could provide her and her children with a stable living. They were not concerned about soul-destruction. At least, I’m fairly sure of that. For the record: this is not an endorsement of lowering the age of consent. I am decidedly neutral on that question, but I do not believe Mexico’s answer is any less correct than California’s, or vice versa.

It is wrong to say that psychiatrists, or therapists, have a superpower of changing people’s phenotypes. This is impossible, as any such change they could impart would be genetically confounded – it is a genetically non-random sample of the population who are “successful” subjects of their interventions. So it seems fair to assume that a lot of mental-health problems are explicable in this way rather than through straight-up iatrogenesis, and that their prevalence is inflated somewhat through media hype and social-media shenanigans. However, an interesting question is: how evolutionarily novel a phenomenon is the field of psychiatry? Are our minds equipped to deal with it? Well, not everyone’s. It seems possible to confect illnesses out of thin air if you subject the right person to the right conditioning, as is the case with the probably purely iatrogenic “dissociative identity disorder.”

Masses of people these days shell out large chunks of their finances on “therapy,” a form of psychiatric intervention which has shown itself to be of, at best, mixed efficacy. Many long-running randomised controlled trials of its effects turn up jack shit, which ought not to be shocking given what is known about the non-effects of education, extensively documented by Bryan Caplan and others: to work, therapy would have to change the brain in some dramatic way. Still lingering, though, is the question of whether it may in fact make matters worse. Many social commentators have noticed the way in which mental illness, especially “depression,” is afforded a kind of bizarre social status in some circles, such as within university culture in Canada. Even more galling, it is not even clear whether “depression” of the garden variety is a disorder at all; it may be an adaptation that evolved to ward people off hopeless pursuits. Status is a powerful motivator, so this weird grievance culture cannot help; but encouraging people to make their living from talking to such people and consoling them with soothing words cannot be great either, since it is likely to induce the kind of institutional inertia on which the pointless continuance of America’s “drug war” is sometimes (correctly) blamed.

Legalising drugs and investing more energies into high-precision “super-drugs,” e.g. powerful mood-enrichers with no side effects, would do more for the true chronic depressives who literally have never even known what it means to be happy – a malady probably induced by rare mutations if it exists – than what is on offer today. Drugs are the only guaranteed way to do profound psychological re-engineering without gene-editing. It is not clear, though, if the psychiatric industry as it currently exists would be happy to see such problems vanish.

Emily Ratajkowski’s Modest Proposal

In response to Alabama’s recent, controversial abortion legislation, model and former Blurred Lines music-video star Emily Ratajkowski posed nude on Instagram, bemoaning how the bill would “perpetuate the industrial prison complex by preventing women of low economic opportunity the right to choose to not reproduce,” and further how “the states trying to ban abortion are the states that have the highest proportions of black women living there.” Ratajkowski, a sex-positive feminist, was obviously blind to her implicit appeal to eugenics, but Breitbart journalist John Nolte jumped at the opportunity to push the recently popular narrative that “Democrats are the real racists,” going so far as to claim that “Ratajkowski Believes Killing Black Babies Is a Public Service,” accusing her of white supremacy, and even comparing her comment to “anything you would read at ‘The Daily Stormer’”. In fact, Ratajkowski’s sentiments are neither as “woke” as she’d like to think nor as fascist as Nolte alleges, but reflect a kind of pragmatism taboo to both the mainstream left and the mainstream right: an overlooked meeting point between humanitarian concerns and elitist, conservative-minded population control.

The relationship between legalized abortion and falling crime rates has, in fact, been well studied, with results pointing to the not-at-all-surprising notion that requiring every pregnancy to go to term, no matter how unwanted or inauspicious, may not actually be great for society. Many scholars cite Roe v. Wade as the prime culprit in the staggering American crime drop of the 1990s, for example. The demographic angle on this truth is touchier but also founded in reality. Blacks are disproportionately likely to be affected by the conditions which lead to poverty and crime, and sure enough, the most recent data show them representing 54% of those incarcerated in Alabama despite being just 26% of the population, and this even while comprising a majority of the aborted pregnancies in the state (62% in 2017). Ratajkowski’s point is basically that we as a society are churning out large numbers of people who are predestined by sociocultural conditions and the prison-industrial complex to live miserable lives, and that this is especially obvious in a state like Alabama. It is hard to imagine that inviting the number of single black Alabaman mothers to skyrocket – as does the state’s new abortion bill – won’t perpetuate increases in poverty, general unrest, and higher incarceration rates in Alabama’s awful prisons (the deadliest in the country). Certainly, this outcome seems more likely than “one of those black babies” emerging from one of the worst public school systems in the country to “cure cancer,” as John Nolte chides us.

It would seem that promoting absolute control over reproduction among those members of society most affected by adversity is something that both humanitarians on the left and those concerned with conserving social order and demographics on the right could find common ground on – but such agreement is far from sight.

As exemplified by the Nolte article, the right is utterly delusional on this point: willing only to make moralistic arguments against abortion, and as a result unwilling to engage with any arguments for it, no matter how pragmatic. Allergic to coupling their opposition to abortion with any reasonable plan to increase the social welfare of those most likely to seek it out, the pro-life GOP must rely on the myth that anyone can lead a good and productive life if only they pull themselves up by their bootstraps. As usual, their exaggerated focus on the individual and personal responsibility blinds them to dynamics that can only be grasped at a larger scale. The pro-life movement is part of the same principles-based conservative tradition that supports starting foreign wars in order to spread “freedom and democracy” worldwide. It riles up a certain obnoxious segment of the population, but its big-picture, long-term effects are disastrous and tend to have the very opposite of a “conserving” effect. “Never mind all the refugees! Never mind all the unwanted children! Freedom and democracy are absolute ends in themselves, and abortion is murder!” In this sense, the pro-life victory in Alabama must be viewed in the same vein as so many other “accomplishments” of the Trump era: rather than moving conservatism toward something more nationalistic and pragmatic – as promised, and as is necessary – Trump has come to embody the last desperate gasp of boomer conservative talking points.

The left occasionally makes valid points about abortion but no longer connects them to any broader program for maintaining a healthy, cohesive society. Rather, the pro-choice movement is now packaged with disastrous policies like laissez-faire immigration, support for the reproduction-incentivizing welfare state, and, increasingly, a general devotion to demographic change and cultural dissolution. And yet occasionally someone like Ratajkowski comes along and says something on the abortion issue that makes one grow nostalgic for the more sensible tone of the early-20th-century progressive era.

There is, of course, a precedent for talking about the eugenics of birth control, and the historical figure who best represents these ideas is none other than progressive-era figurehead and Planned Parenthood founder Margaret Sanger. Pro-lifers love to point out that Sanger was pro-eugenics and therefore basically Hitler. Many of them would be surprised to learn that Sanger was in fact anti-abortion and simply a radical proponent of contraception. She was indeed pro-eugenics, but she was not a mere social Darwinist. For Sanger, eugenics and a humanitarian concern for the poor went hand in hand. Not only would birth control reduce the population of an underclass whose high fertility rates had a demonstrably negative impact on society; it could also improve that class’s standard of living. Just as critics of immigration accurately point out that immigration has a negative impact on the citizens of a country, who must compete with new arrivals for jobs, Sanger argued that promoting birth control in adversely affected communities would empower them to advance. That such a promotion of birth control would also have a eugenic effect was simply another, complementary benefit. Her biggest crime, it seems, was to question the idea that all lives are necessarily good and valuable things – a cardinal offense in a mass-democratic society.

The fact is, it’s easier to virtue signal against eugenics than to provide the underclass a decent life. Leave it to the irrational banter of the culture wars to prevent us from having a more productive conversation about reproductive rights.