To make the necessary macro-scale dynamics clear and explicit – so that they can be taken as design constraints.

Civilisation X

a currently unknown hypothetical civilisation design that would [A] align collective action and civilisational values; [B] stabilise the balance of power and safeguard against totalitarianism; and [C] eliminate the material (scarcity) and cultural (ideology) generative conditions of conflict and control – to ultimately realise existential security and flourishing (utopia).

 

Civilisation X is a more philosophically precedented, EA-legible conceptual reframing of Hall’s ‘Game B’ – an alternative operating system to civilisation’s historically dominant system (Game A) – emulating MacAskill’s ‘Cause X’, itself derived from Parfit’s ‘Theory X’: ‘a currently unknown hypothetical theory that would solve some important open problems in population ethics’.

As well as sharing its conceptual form, Civilisation X – civilisation design, transition and alignment – is Cause X: “a cause that's one of the most important moral problems of our time, but that we haven't even clearly conceptualized yet”.

Constituent Concepts

Game B (Hall, Rutt): Conceptualised by Jordan Hall and Jim Rutt, Game B proposes a necessary transition from our current "Game A"—a rivalrous, zero-sum system characterised by resource extraction, competition, and exponential growth on a finite planet. Game B is a hypothetical, anti-rivalrous social operating system designed to be "omni-win". It seeks to maximise human flourishing and coherence without relying on the growth-dependent dynamics that currently drive the "meta-crisis" (the convergence of ecological, economic, and sense-making failures).

Theory X (Parfit): Originating in moral philosophy, this is Derek Parfit’s search for the elusive "Unified Field Theory" of population ethics. Parfit struggled with the "Repugnant Conclusion"—the counter-intuitive implication that a vast population of people with lives barely worth living is "better" than a smaller population of very happy people. Theory X is the placeholder for the undiscovered moral theory that would resolve these problems, allowing us to ethically weigh the value of future lives without arriving at repugnant or counter-intuitive conclusions.

Cause X (MacAskill): A central concept in Effective Altruism, Cause X represents epistemic humility. It is the recognition that, just as previous generations were blind to the moral catastrophes of their time (e.g., slavery), we are likely ignoring a moral issue of overwhelming importance simply because we lack the conceptual tools to see it yet. It treats the "search for the most important problem" as a high-priority problem in itself, urging us to keep resources liquid for when this unknown cause is revealed.

High-Level Frames

Civilisation Design Criteria & Constraints (Schmachtenberger): Daniel Schmachtenberger frames civilisation not as an organic accident, but as an engineering challenge governed by hard boundary conditions. This framework asserts that for any civilisation to be "non-self-terminating" in the long run, it must adhere to strict constraints: it must have closed-loop material flows (no waste/pollution), anti-rivalrous incentives (where individual success does not require collective harm), and a high-fidelity information ecology. Any system that violates these constraints will inevitably succumb to entropy and collapse.

Macrosecuritisation (Buzan, Wæver): This concept describes the shift of the "referent object" of security from the Nation-State to Humanity or the Biosphere itself. It is the political act of framing specific systemic risks—such as unaligned AI or ecological collapse—not as problems to be managed within normal politics, but as existential threats to the survival of the entire system. By establishing these threats as "macro-security" issues, it creates the legitimacy required for extraordinary measures, global coordination, and the suspension of standard sovereignty in favour of planetary governance mechanisms.
