“We’ve Taken Care of Everything”: How Rush’s 2112 Predicts AI’s Threat to Creativity—and What It’s Doing to Schools
Rush’s 1976 album 2112 is usually taught in rock history as a progressive-rock landmark: a side-long science-fiction epic that helped save the band’s career and, more importantly, doubled down on artistic independence. But 2112 also reads today like a disturbingly accurate parable about a modern fear—what happens when a centralized system “takes care of everything” cultural, and the human impulse to create becomes optional, managed, and eventually discouraged. In Rush’s story, the tyrants are the Priests of the Temples of Syrinx. In the twenty-first century, the “priests” look less like robed bureaucrats and more like platforms, models, and the institutions that increasingly outsource judgment, taste, and even thinking to AI.
The Temples of Syrinx as a blueprint for automated cultural control
In “The Temples of Syrinx,” the Priests boast that they regulate not just politics but culture itself: “We’ve taken care of everything / The words you read / The songs you sing” (Rush). That’s not just censorship—it’s a claim to curation-as-authority, the idea that a central system can administer your cultural diet so thoroughly that curiosity and risk disappear. The Priests even celebrate the machinery behind the control: “Our great computers / Fill the hallowed halls” (Rush). In 1976, that line signaled futuristic oppression; in 2025, it sounds like a description of how digital infrastructure mediates what art is seen, heard, recommended, copied, and monetized.
A major difference, of course, is that today’s systems don’t need to ban a guitar to suppress creativity. They can drown it. Modern music discovery is heavily shaped by recommender systems, and a UK government-commissioned literature review notes long-running concerns that algorithms can reduce discovery and “homogenise taste,” while also disempowering independent artists—especially when the decision-making is opaque (Hesmondhalgh et al.). That maps cleanly onto Rush’s fictional world: you don’t have to forbid new music if you can ensure most people never encounter it, or if the system nudges everyone toward the same safe, low-friction options.
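To make the "drowning" mechanism concrete, here is a deliberately simple toy simulation in Python. It is a sketch of the rich-get-richer dynamic the literature review worries about, not a model of any real platform; every name and parameter below is illustrative. A recommender that mostly serves whatever is already popular quickly concentrates listens on a small slice of the catalogue, even though every track starts out identical.

```python
import random

def simulate_popularity_feedback(n_tracks=500, n_listens=50_000,
                                 popularity_bias=0.9, seed=42):
    """Toy rich-get-richer model (illustrative parameters, not real platform data):
    with probability `popularity_bias` the recommender serves a track in proportion
    to its existing play count; otherwise it picks a track uniformly at random."""
    rng = random.Random(seed)
    plays = [1] * n_tracks  # every track starts with one play so weights are nonzero
    for _ in range(n_listens):
        if rng.random() < popularity_bias:
            # Exploit: sample in proportion to current popularity.
            track = rng.choices(range(n_tracks), weights=plays, k=1)[0]
        else:
            # Explore: uniform choice across the whole catalogue.
            track = rng.randrange(n_tracks)
        plays[track] += 1
    return plays

if __name__ == "__main__":
    plays = simulate_popularity_feedback()
    top_decile = sum(sorted(plays, reverse=True)[:50])  # top 10% of 500 tracks
    print(f"Share of listens captured by the top 10% of tracks: {top_decile / sum(plays):.0%}")
```

The point is not the specific numbers but the feedback loop: the more the system leans on existing popularity, the less anyone encounters the margins of the catalogue where new directions tend to come from.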
From “forbidden guitar” to “infinite content”: why AI can feel like the end of originality
In 2112, the protagonist’s crisis begins when he discovers a guitar and realizes its radical implication: “Let them all make their own music” (Rush). The Priests destroy the instrument, not because sound is dangerous, but because independent creation is dangerous to a system that claims to administer meaning. That’s the core of Rush’s warning: creativity isn’t just entertainment—it’s autonomy.
Generative AI complicates this autonomy in two ways. First, it can produce “good enough” text, images, and music at scale—so the cultural environment fills with plausible, polished outputs that require little effort to generate and little patience to consume. Second, these systems are often trained on existing human work, then used to compete in the same markets. The U.S. Copyright Office has explicitly framed this as a risk to the incentives that copyright is meant to protect, emphasizing that “facts on the ground are evolving” rapidly and that incentives for creative activity are a central concern as AI training and output scale (U.S. Copyright Office). Internationally, WIPO highlights how AI is forcing new debates in intellectual property—especially around authorship, ownership, and the protection of creative works in AI-rich environments (WIPO).
Even research that finds benefits from AI assistance often reveals a tradeoff that echoes the Temples of Syrinx: AI can make creation easier, but also more standardized. An academic study on generative AI and novelty reports evidence consistent with the idea that AI support can increase individual creativity while reducing the diversity of novel content overall—in other words, more output, less variety across the culture (Doshi and Hauser). That is basically Rush’s nightmare in a modern key: plenty of “songs you sing,” but fewer truly new directions, because the system learns the average and reproduces it.
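The "learns the average and reproduces it" intuition can also be sketched in miniature. The toy Python example below is illustrative only and is not the method used by Doshi and Hauser: it treats each writer's idea as a vector and models AI assistance as blending every idea toward one shared model style, showing only the homogenization half of the finding, namely how a pool of individually polished pieces can still end up less diverse overall.

```python
import random
import statistics

def mean_pairwise_distance(points):
    """Average Euclidean distance between all pairs: a crude proxy for diversity."""
    dists = []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            dists.append(sum((a - b) ** 2 for a, b in zip(points[i], points[j])) ** 0.5)
    return statistics.mean(dists)

def simulate(n_writers=200, dims=10, pull_toward_model=0.7, seed=1):
    """Toy model, not the Doshi and Hauser methodology: each writer starts with an
    idiosyncratic 'idea vector', and AI assistance blends every idea toward a single
    shared model vector. All numbers are illustrative."""
    rng = random.Random(seed)
    model_style = [rng.gauss(0, 1) for _ in range(dims)]
    human_ideas = [[rng.gauss(0, 1) for _ in range(dims)] for _ in range(n_writers)]
    assisted_ideas = [
        [(1 - pull_toward_model) * h + pull_toward_model * m
         for h, m in zip(idea, model_style)]
        for idea in human_ideas
    ]
    print("diversity, human-only pool:  ", round(mean_pairwise_distance(human_ideas), 2))
    print("diversity, AI-assisted pool: ", round(mean_pairwise_distance(assisted_ideas), 2))

if __name__ == "__main__":
    simulate()
```

Because every blended idea is pulled toward the same point, the distance between any two assisted ideas shrinks by a factor of (1 - pull_toward_model), which mirrors the "less variety across the culture" pattern described above.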
“We have assumed control”: AI imitation and the erosion of trust in “who created this?”
Rush ends the suite ambiguously: after the protagonist is crushed, an ominous voice declares, “We have assumed control” (Milano). In the AI era, that line lands differently because the question “who made this?” is no longer trivial. When synthetic content can imitate bands, voices, and styles, the cultural ecosystem’s basic trust gets shaky. Recent reporting on AI-generated music impersonation—where tracks can circulate as if they belong to real artists—shows how streaming-era distribution plus generative tools can blur authenticity and authorship in practice (Monroe; Richards). When the boundary between an artist and an algorithmic “sound-alike” collapses, the conditions for meaningful creative credit—and creative risk—collapse with it.
That matters historically for rock because rock culture often treats authenticity not as a marketing slogan but as an ethical stance: a belief that a voice comes from lived experience, friction, limitation, and craft. 2112 dramatizes that belief by turning a single handmade instrument into a symbol of human agency. AI doesn’t always “ban” the instrument; it can make it feel irrelevant.
AI in schools is hollowing out critical thinking and creativity
The most direct way 2112 predicts an “end of creativity” may not be the music industry at all—it may be education. Schools are where creative confidence is built, where students learn to wrestle with ambiguity, and where “thinking” becomes a practiced skill rather than a slogan. If AI becomes the default shortcut in that environment, the long-term cultural outcome looks less like a sudden dictatorship and more like a slow atrophy: students who can generate polished answers without building the mental muscles that produce real insight.
UNESCO’s global guidance warns that, over time, reliance on generative AI in education may have “profound effects” on human capacities “such as critical thinking skills and creativity,” and says those effects should be evaluated and addressed (Miao and Holmes 25). It also cautions that without sound pedagogical strategies, AI use may limit children’s creativity and originality, potentially leading to “formulaic writing” and fewer opportunities for “plural opinions” and “critical thinking” (Miao and Holmes 34). Even more bluntly, UNESCO flags that dependency on AI for suggestions can standardize responses and weaken independent thought and self-directed inquiry (Miao and Holmes 39). That is the Temples of Syrinx in classroom form: the system “takes care of everything,” and students gradually stop asking “how or why.”
Empirically, educators are already worried about exactly this outcome. Education Week reports teacher concern that AI will impede students’ critical thinking, even as many teens perceive it more neutrally (Klein). The tension matters: adults can see the cognitive costs, while students—surrounded by convenience—may interpret reduced struggle as improved learning.
There are also early research signals that “outsourcing thinking” changes learning behavior and outcomes. A randomized controlled trial studying the use of ChatGPT as a study aid measured long-term retention via a surprise test 45 days after learning, explicitly framing the risk as AI functioning like a “cognitive crutch” rather than a scaffold (Barcaui). Separate research on AI-assisted essay writing has raised concerns about “cognitive debt”—the possibility that repeated reliance shifts effort away from planning, synthesizing, and owning ideas (Kosmyna et al.). You don’t need to claim students are becoming incapable overnight to see the pattern: if the hard parts of thinking (structuring, revising, reflecting, making mistakes) are routinely offloaded, then students get fewer repetitions of the very processes that produce durable understanding and original voice.
Student perceptions also align with the fear that creativity is being flattened into “acceptable output.” A study commissioned by Oxford University Press, reported in The Guardian, found students themselves describing AI as making schoolwork “too easy,” limiting creative thinking, and encouraging copying rather than original work (Adams). That is striking because it echoes Rush’s protagonist, who believes creativity flourishes when people “make their own music” (Rush). When students experience AI as the thing that finishes the assignment for them, they are less like empowered artists and more like citizens of Syrinx—consuming competence instead of developing it.
So the threat is not simply “cheating.” The deeper issue is the gradual replacement of process with product. Education is supposed to train the process: how to reason, how to argue, how to write like a person who has something to say. If AI becomes the default author, students may still submit fluent paragraphs, but they submit fewer thoughts.
Conclusion: Rush didn’t predict today’s AI—Rush predicted the temptation behind it
To say 2112 “accurately predicts” the end of creativity due to AI is not to claim Neil Peart forecast large language models. It’s to claim he identified a recurring cultural temptation: when systems promise comfort, efficiency, and centralized “wisdom,” people slowly trade autonomy for ease. In 2112, that tradeoff is dramatized as a world where priests manage culture with “great computers” and claim to “take care of everything” (Rush). In our world, the same logic appears when recommender systems narrow discovery, when generative systems standardize style, and—most urgently—when students stop practicing critical thinking because a tool can produce the answer faster than they can form a question.
The rock-historical irony is that 2112 was itself Rush’s act of rebellion in the name of creative independence (Milano). The album’s survival story becomes part of its message: creativity is not guaranteed by technology; it’s protected by choices—by refusing to let the “hallowed halls” decide what we read, sing, or think.
Works Cited
Adams, Richard. “Pupils Fear AI Is Eroding Their Ability to Study, Research Finds.” The Guardian, 15 Oct. 2025, https://www.theguardian.com/technology/2025/oct/15/pupils-fear-ai-eroding-study-ability-research. Accessed 13 Dec. 2025.
Barcaui, André. “ChatGPT as a Cognitive Crutch: Evidence from a Randomized Controlled Trial on Knowledge Retention.” Social Sciences & Humanities Open, vol. 12, 2025, article 102287. ScienceDirect, https://www.sciencedirect.com/science/article/pii/S2590291125010186. Accessed 13 Dec. 2025.
Doshi, Anil R., and Oliver P. Hauser. “Generative Artificial Intelligence Enhances Creativity but Reduces the Diversity of Novel Content.” arXiv, 14 Mar. 2024, arXiv:2312.00506, https://arxiv.org/abs/2312.00506. Accessed 13 Dec. 2025.
Hesmondhalgh, David, et al. “The Impact of Algorithmically Driven Recommendation Systems on Music Consumption and Production: A Literature Review.” GOV.UK, 9 Feb. 2023, https://www.gov.uk/government/publications/research-into-the-impact-of-streaming-services-algorithms-on-music-consumption/the-impact-of-algorithmically-driven-recommendation-systems-on-music-consumption-and-production-a-literature-review. Accessed 13 Dec. 2025.
Klein, Alyson. “Teachers Worry AI Will Impede Students’ Critical Thinking Skills. Many Teens Aren’t So Sure.” Education Week, 24 Oct. 2025, https://www.edweek.org/technology/teachers-worry-ai-will-impede-students-critical-thinking-skills-many-teens-arent-so-sure/2025/10. Accessed 13 Dec. 2025.
Kosmyna, Nataliya, et al. “Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task.” arXiv, 10 June 2025, arXiv:2506.08872, https://arxiv.org/abs/2506.08872. Accessed 13 Dec. 2025.
Miao, Fengchun, and Wayne Holmes. Guidance for Generative AI in Education and Research. UNESCO, 2023.
Milano, Brett. “‘2112’: Rush’s Landmark Album Explained.” uDiscover Music, 1 Mar. 2025, https://www.udiscovermusic.com/stories/rush-2112-album-explained/. Accessed 13 Dec. 2025.
Monroe, Jazz. “King Gizzard & the Lizard Wizard AI Impersonator Removed From Streaming.” Pitchfork, 24 July 2025, https://pitchfork.com/news/king-gizzard-and-the-lizard-wizard-ai-impersonator-removed-from-streaming/. Accessed 13 Dec. 2025.
Rush. “2112 Lyrics.” Rush.com, Rush, https://www.rush.com/songs/2112/. Accessed 13 Dec. 2025.
U.S. Copyright Office. Copyright and Artificial Intelligence, Part 3: Generative AI Training (Pre-Publication Version). Library of Congress, May 2025, https://www.copyright.gov/ai/Copyright-and-Artificial-Intelligence-Part-3-Generative-AI-Training-Report-Pre-Publication-Version.pdf. Accessed 13 Dec. 2025.
WIPO. “Artificial Intelligence and Intellectual Property.” World Intellectual Property Organization, https://www.wipo.int/en/web/frontier-technologies/artificial-intelligence/index. Accessed 13 Dec. 2025.