Part 2

Chapter 7

PayPal's Philosophers

This chapter traces how Silicon Valley's founding generation laundered Galton's vocabulary of ranked fitness into the idiom of meritocracy, disruption, and market sorting — making the eugenic statistical tradition newly respectable at precisely the moment its policy implications were most consequential.

Drafting

Synopsis

The hereditarian argument — that a normally distributed, substantially heritable cognitive capacity explains the social hierarchy, and that the hierarchy is therefore natural and beyond the reach of redistribution — was Galton’s argument in 1869. It is also the argument of The Bell Curve in 1994 and the implicit premise of Silicon Valley’s meritocratic ideology in 2026. What changed between these three iterations was not the claim but the institutional vehicle: from state eugenics to psychometric testing to venture capital hiring, the ranked distribution was put to work in different settings, with different instruments, serving different constituencies. This chapter traces the statistical inheritance from Galton’s anthropometric laboratory to the PayPal Mafia’s hiring philosophy and from there to the neoreactionary political theory that Thiel-adjacent circles have found plausible. The through-line is not conspiracy. It is a statistical grammar — reification, ranking, naturalisation of hierarchy — that persists because it is useful, not because it is true.

1. The Statistical Inheritance

In 1884, Galton charged visitors threepence and enrolled them, unknowingly, in a ranked population distribution. The laboratory produced no clear evidence of a single underlying intelligence: the correlation between sensory acuity and eminence was weak. But it established the methodological template that would outlast its failed measurements: gather observations across multiple dimensions, look for systematic correlations, extract a single underlying factor that ranks human beings on a linear scale. Twenty years later, Charles Spearman completed the project Galton had begun. His 1904 paper, “‘General Intelligence,’ Objectively Determined and Measured,” extracted g — a general factor — from the matrix of positive intercorrelations among schoolchildren’s test scores, and treated this mathematically extracted latent variable as a real property of individuals: something that existed in varying quantities, could in principle be inherited, and could be measured. Stephen Jay Gould’s precise diagnosis — reification and ranking — names both moves: the statistical abstraction treated as a biological reality, multidimensional human variation compressed onto a single ascending scale.
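The two moves Gould names can be made concrete in a few lines of Python. The sketch below (the correlation matrix is invented for illustration, not Spearman's data) extracts the first principal component of a matrix in which every test correlates positively with every other — the "positive manifold" — and shows that a single factor with all-positive loadings falls out of the mathematics automatically. Reification is the further, non-mathematical step of treating that extracted factor as a biological property.

```python
import numpy as np

# A toy correlation matrix with a "positive manifold": every test
# correlates positively with every other, as Spearman observed.
# (Illustrative values, not Spearman's 1904 data.)
R = np.array([
    [1.0, 0.6, 0.5, 0.4],
    [0.6, 1.0, 0.5, 0.4],
    [0.5, 0.5, 1.0, 0.3],
    [0.4, 0.4, 0.3, 1.0],
])

# The first principal component plays the role of the "general factor":
# the eigenvector of R with the largest eigenvalue.
eigvals, eigvecs = np.linalg.eigh(R)   # eigh sorts eigenvalues ascending
g_loading = eigvecs[:, -1]
g_loading = g_loading * np.sign(g_loading.sum())  # fix sign: loadings positive

# Share of total variance carried by the single factor.
share = eigvals[-1] / eigvals.sum()
print(np.round(g_loading, 3), round(share, 3))
```

Any matrix of uniformly positive correlations yields such a factor; its existence is a theorem about the matrix, not evidence about brains.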

Cyril Burt’s mid-century twin studies provided the heritability anchor for the educational sorting that Spearman’s g made ideologically available. The studies were fraudulent: after Burt’s death in 1971, Leon Kamin noticed that reported correlation coefficients remained identical to three decimal places across publications with changing sample sizes — a statistical impossibility. The educational system that those numbers had helped justify — the 11-plus examination sorting British children into educational tracks at age eleven — survived the fraud. Arthur Jensen’s 1969 Harvard Educational Review article used the fraudulent twin-study literature, alongside Spearman’s g framework, to argue that compensatory education was futile because racial gaps in IQ were substantially genetic. The Bell Curve (1994) restated Jensen’s claim at book length, with regression tables using the National Longitudinal Survey of Youth demonstrating that IQ predicted poverty better than a socioeconomic-status composite — a finding that required a thin SES proxy to suppress the structural variables that would have made IQ’s coefficient collapse to noise.
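Kamin's red flag can be reproduced numerically. The sketch below (sample sizes are illustrative, not Burt's actual ones) draws "twin pairs" from a population whose true correlation is 0.771 — the figure Burt reported unchanged across successive papers — and re-estimates the coefficient on samples of different sizes. Sampling error alone makes the rounded estimates differ essentially every time; identical values to three decimal places across changing samples are what cannot happen honestly.

```python
import numpy as np

rng = np.random.default_rng(0)
true_r = 0.771  # the coefficient Burt reported repeatedly, here just a target

# Bivariate-normal "twin pairs" with population correlation true_r,
# re-sampled at three different sizes, as across successive publications.
cov = [[1.0, true_r], [true_r, 1.0]]
estimates = []
for n in (21, 30, 53):  # illustrative sample sizes, not Burt's
    x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
    estimates.append(round(float(np.corrcoef(x, y)[0, 1]), 3))

print(estimates)  # three estimates of the same underlying correlation
```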

2. The Stanford Crucible

Peter Thiel, as an undergraduate at Stanford (1985–1989), co-founded the Stanford Review in 1987 as an explicitly anti-PC platform; he later attended Stanford Law School. The intellectual environment at Stanford in the late 1980s and early 1990s was saturated with the debate that Allan Bloom’s The Closing of the American Mind (1987) and Dinesh D’Souza’s Illiberal Education (1991) had catalysed: the idea that liberal universities had abandoned the pursuit of truth in favour of ideological conformity, and that the appropriate response was to defend the Western canon and the meritocratic ideal against the cultural left’s assault. Thiel absorbed this framework and gave it a specific economic inflection: the meritocracy was not just intellectually superior to the egalitarian alternative but was the mechanism by which cognitive talent was correctly sorted and compounded into exponential returns.

René Girard, Thiel’s favourite philosopher and mentor at Stanford, provided the theoretical vocabulary that made the meritocratic faith coherent: mimetic desire produced conformity, conformity produced mediocrity, and the truly creative individual — the zero-to-one entrepreneur — was the one who escaped the mimetic trap and created something genuinely new. In Thiel’s synthesis, Girard’s account of the exceptional individual who escapes crowd psychology becomes an account of the exceptional cognitive individual who escapes the normal distribution’s median. The power law — the empirical pattern in venture returns, startup outcomes, and tech market capitalisation where a small number of hits account for most of the value — was for Thiel not just a market phenomenon but a natural law, an expression of the underlying cognitive distribution that the market was merely revealing.
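The power-law pattern the paragraph describes can be made concrete with a toy portfolio. In the sketch below, the Pareto exponent is an assumption chosen to produce a heavy tail, not an empirical estimate of venture returns; the point is only the shape — a handful of outcomes carrying most of the total value.

```python
import numpy as np

rng = np.random.default_rng(42)

# 100 "startup outcomes" drawn from a heavy-tailed Pareto distribution.
# An exponent below 2 gives the winner-take-most shape attributed to
# venture returns; the exact value here is illustrative only.
alpha = 1.2
outcomes = rng.pareto(alpha, size=100) + 1.0  # returns as multiples of 1x

outcomes.sort()
top5_share = outcomes[-5:].sum() / outcomes.sum()
print(f"top 5 of 100 outcomes capture {top5_share:.0%} of total value")
```

Whether that concentration reveals an underlying cognitive distribution, rather than the dynamics of markets and capital, is precisely the inference Thiel's synthesis takes for granted.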

3. PayPal and the Meritocracy Machine

PayPal’s hiring philosophy was an operational expression of the meritocratic faith. Thiel, Levchin, and their co-founders were explicit: they were looking for the cognitively exceptional. The hiring process favoured candidates who had performed well on standardised tests, competed in academic olympiads, and demonstrated the kind of problem-solving precocity that Spearman’s g tradition claimed to measure. The goal was to concentrate a high-g population in a single organisation, on the thesis that the variance of a company’s output was driven by its top performers rather than its average, and that a small number of exceptional hires could produce returns that would dwarf anything achievable through conventional workforce scaling.

Thiel’s Zero to One (2014) makes the population reasoning explicit. Against the optimist and pessimist views of the future, Thiel argues that the correct position is the “contrarian” one: most people are wrong most of the time because mimetic pressure produces convergence on consensus, and the exceptional investor or entrepreneur is the one who identifies what the crowd does not. Whether or not the entrepreneur has Galton’s natural ability or Spearman’s g, the structural position is identical: there is a distribution of human capacity for original thought, the distribution is stable and partially heritable, and the investor’s job is to find the people at the right tail. This is Galton’s programme without a state breeding policy — it does not need the state because the market does the sorting, and venture capital does the amplifying.

4. Neoreaction as Control Theory

Curtis Yarvin, writing as Mencius Moldbug from 2007 onward, developed the political theory that absorbed and radicalised the meritocratic premise. His central claim was that liberal democracy was a corrupted control system: the “Cathedral” — his term for the combination of elite universities, mainstream media, and progressive policy networks — was a feedback system that amplified noise rather than signal, producing increasingly incoherent policy outputs because its error-correction mechanisms had been captured by ideological interests. The solution was to replace the corrupted feedback system with a functional one: a sovereign entity with clear ownership, well-defined objectives, and the authority to correct errors without the friction of democratic representation.

This is cybernetics applied to constitutional theory, using the vocabulary of control systems to argue for autocracy. The Cathedral is the social feedback loop whose error signal has been corrupted; the solution is to redesign the feedback architecture to restore correct error-correction. Yarvin’s framework is explicitly indebted to Shannon’s information theory and to the broader Macy Conferences tradition he absorbed through secondary sources; the Cathedral is a noisy channel whose signal-to-noise ratio is too low to produce accurate policy. The argument that democracy is a corrupted information-processing system that should be replaced with an optimised managerial sovereignty is the logical endpoint of the social-cybernetics tradition Wiener had warned against in 1950: the application of system-correction logic to governance, where the deviations to be corrected are the political claims of people who disagree.

5. The New Anthropometric Laboratory

The contemporary iteration of Galton’s project is genomic and explicit. Elon Musk has publicly endorsed population-level eugenic reasoning and expressed concern about what he characterises as the differential fertility of high- and low-intelligence populations. The venture capitalist Marc Andreessen’s 2023 “Techno-Optimist Manifesto” endorsed the view that technology would allow the selection of “better” human characteristics. The company Genomic Prediction offers pre-implantation genetic testing for polygenic scores, including predicted IQ. The service takes blastocysts produced through IVF, genotypes them, ranks them by predicted cognitive aptitude scores derived from genome-wide association studies, and advises on which embryos to implant. It is Galton’s Anthropometric Laboratory at the level of the genome: many measurements, one ranking, and the visitor — now the prospective parent — receives a card with the rank, and something stays behind in the laboratory.

The chapter’s arc closes at the point where it began: a small fee, a ranking instrument, a duplicate that serves a purpose different from the one the paying participant understands. The instruments are different — SNP arrays instead of dynamometers, polygenic scores instead of grip-strength centiles — but the mathematical grammar is the same. A normal distribution of a heritable quality; a position in that distribution as the measure of worth; a technology that makes the position visible, bankable, and actionable. What Galton proposed as a state programme the market has made a consumer service. The eugenic ledger is still open. It moved from the archive to the cloud.

Connection Forward

Part III opens with Chapter 8’s examination of Effective Altruism — which applies the utilitarian cost-effectiveness tradition of Chapter 6 at global scale, treating poverty as a tractable engineering problem and quantified impact as the measure of moral worth. EA is the philosophical culmination of the optimisation ideology whose institutional history Chapters 5 and 6 have traced; it is also the framework whose assumptions Chapter 8 will begin to deconstruct.

Key Claims