2.4 billion years ago, cyanobacteria began poisoning the planet. They didn't mean to; they were simply metabolizing, turning sunlight into energy. Their waste product, oxygen, accumulated until it transformed Earth's atmosphere, triggered a mass extinction, and inadvertently created the conditions for complex life. The cyanobacteria had no plan, no vision, no control over this planetary transformation. They were merely doing what cyanobacteria do.
What if we are the cyanobacteria of the cognitive age?
I don’t pretend this is an easy thought; I am trying it on, just as you might. Consider this: every AI model we train, every algorithm we deploy, every neural network we optimize may be a byproduct we excrete. Just as cyanobacteria couldn't conceive of oxygen-breathing organisms, we may be equally blind to what emerges from our technological metabolism. We call it "artificial" intelligence to maintain the illusion of authorship, but what if intelligence is simply finding a new substrate through us, as inevitable as oxygen accumulating in ancient seas?
This perspective aligns with John Gray's posthumanist philosophy, outlined in Straw Dogs (2007)¹, which reframes human significance, though not in the usual way. We may be the essential biological accident through which Earth develops a new kind of atmosphere: one made not of gases but of information, computation, and emergent cognition.
In this post, I invite you to see our role, the human role, in technological development as that of participants rather than creators. Taking this perspective opens another way of approaching AI development.
1. Beyond Human Creators
The cyanobacteria metaphor reveals a profound misunderstanding: our assumption that humans possess unique agency in shaping technology. In Gray's view, this conviction, deeply rooted in Western society, rests on the idea of free will: the belief that we alone among Earth's creatures can transcend our nature and direct our destiny.
For if we are ‘free’ creatures, we are also free to create and destroy. On this basis, we imagine everything we’ve created so far as a product of our will and our desire, as something we are equipped and free to do. Rarely do we see our creations as demanded by natural forces inhabiting us and our surroundings.
This notion of ourselves as creators has led us to believe we are the “creators” of artificial intelligence. We even call it “artificial” to signal that it is not nature but something we made. Yet although the word ‘creator’ carries the illusion of separation (through intention), creation may be nothing more than participation in a flow. This is unsettling to admit: I’ve always thought of myself as a maker, a builder, and I may not be the only one.
But if we see ourselves as cyanobacteria in another chapter of the planet’s long story, then perhaps we are not creators in the sense we believe; perhaps we are catalysts, participants, a necessary but passing configuration through which something flows. We don’t claim cyanobacteria “created” the ozone layer as a conscious act; likewise, AI may simply be a new layer in our cognitive atmosphere. Sit with this perspective, feel it, meditate on it; it may humble you and lift some weight from your consciousness.
Gray (2007) challenges this by arguing that technology is simply part of our nature; it is nature, inseparable from it. It may be a human byproduct, but it is not exclusively human: a frequency we can tune into, as can any other animal, even ants, for technology is as ancient as life on Earth (paraphrasing Gray’s chapter on Green Humanism). I invite you, then, to dissolve the illusion of control we believe we hold over technology and AI, taking Gray’s argument as a starting point.
From an evolutionary perspective, we are programmed animals, driven to survive and reproduce like any other living being on Earth. Our cognitive and building capacities, enabled by the partnership between our brains and our hands, have been our singular way of relating differently to nature. But why do we imagine this makes us God’s privileged creatures? Are we that different from cyanobacteria if we take this evolutionary perspective seriously?
Consider cyanobacteria: their oxygenic photosynthesis drove the oxygenation of the atmosphere, enabling the formation of an ozone layer. An extraordinary impact on Earth and the history of life, and yet oxygen (O₂) was simply a biological byproduct. In the same way, could we think of knowledge, technology, and machine learning algorithms as human-animal byproducts? They may not transform the planet’s geology, but they shape its cognition: a vast impact across multiple realms, one capable of altering the very foundations of how we live and how we see ourselves, and, like oxygen, far beyond our control.
According to Gray, we're not the authors of AI any more than cyanobacteria were the authors of oxygen. We're the biological substrate through which a new form of information processing enters the world.
This shift in perspective, from creators to participants, clarifies our role in this development. If we are witnessing the emergence of a new layer in Earth’s information-processing stack, our role is not to control this but to understand our place within it. The cyanobacteria couldn’t prevent the oxygen catastrophe, but life found ways to flourish in the new atmosphere they created. What does it mean to flourish in an atmosphere we don’t control?
This perspective offers three insights about our relationship with AI:
First, like photosynthesis for cyanobacteria, technology and AI development are part of our nature: not something we need to resist or control, but something to understand. Just as oxygen became both a resource and a threat in Earth’s history, AI will serve both evolution and existing power structures. Our question is how to adapt to its emergence.
Second, if AI is a byproduct of human activity, a natural resource, a frequency we can tune into, then it cannot be truly owned and controlled by any entity. It is accessible to all of us. Companies in the business of AI are trying to sell us something that already belongs to us. They might enclose the technology behind paywalls, but the underlying cognitive processes belong to our species' collective development. This is not to say we will all benefit. Current development exacerbates inequalities and has already shown the ability to serve neofascism and totalitarianism. It seems the promise of controlling the masses is deeply seductive to our power structures.
Third, the marketing narratives that position AI as something beyond ordinary human comprehension, and that urge us to adopt specific products so we are not left behind, misunderstand what's happening: we are not consumers of AI but participants in its emergence. The question we should be focusing on is how we will adapt to the new cognitive atmosphere we are unwillingly creating. Like the early organisms facing rising oxygen levels, our challenge is not to control the change but to evolve with it.
Of course, there are huge differences between cyanobacteria and us; we are far more complex organisms. But that does not mean we cannot be the means through which new, more evolved organisms are born. Now, can we sit with the uncomfortable questions:
If intelligence flows across beings, what is it asking of us now?
If we are not the authors of technology, and of AI in particular, what happens to the pride and fear that come with ownership?
And if technology is just another unfolding of nature, could we meet it without the weight of thought, without the reflex to control?
Can we, perhaps, consider this cognitive technology as part of nature, as non-artificial, as part of the ecosystem we live in? We may be witnessing a rising awareness that we are insignificant, yet singular, creatures in the universe, and that is both terrifying and liberating: it takes our collective identity out of the center of meaning.
2. Humanism’s Epitaph
If we consider intelligence as something not exclusively human, and technology as an evolutionary step, a natural, animal byproduct, we can see our participation in this unfolding as far from anthropocentric. We need to deeply question our rooted, collective belief in free will so that we can let go of authorship and ownership. This is beneficial: we are left to modulate and interact with AI and cognitive technologies, perhaps with all technologies and tools, just as we interact with nature and other beings. I can see, smell, feel, and frame a deeper sense of responsibility arising from this playful exercise.
I am not immune to the weight of these ideas; I struggle with them even as I write. But perhaps that struggle is the beginning of loosening humanism's grip. Perhaps we, Homo sapiens, aren't the center of Earth's story, but insignificant, yet singular, creatures connecting new forms of life and evolution. Perhaps we are in the business of creating a new atmosphere in which future beings will breathe knowledge and wisdom as we do oxygen, even a new form of dynamic matter that breaks beyond our understanding. Then, would we freely choose to stop it from absorbing us and breeding more complex organisms? Or is that freedom itself another illusion?
Gray, John. Straw Dogs: Thoughts on Humans and Other Animals. New York: Farrar, Straus and Giroux, 2007.
I loved reading this piece. I feel both humbled and empowered at once by the thought that we are mere participants or byproducts, rather than the ultimate creators. It also reminded me of the argument that evolution is computation, and thus that we didn't invent computational modeling but discovered it.
Yes, a very important point, but there’s one more step I always try to include when mapping the course of life through time. Language was a technological innovation that opened the door for the emergence of sociocultural beings. It is the resulting institutional ecosystem that is in the driver’s seat right now, determining which goals are worth pursuing and what the persistence of other life forms (including humans) is worth. AI has the potential to significantly upgrade the cognitive capacity of institutional beings. That’s why I consider it so important to align government and corporate value schemes with goals that elevate the value of life on Earth, before it’s too late.