*I taught this reflection virtually to seminary alumni; it explores the tension between calculative and meditative thinking.*
Every day, we interact with a dizzying array of screens, algorithms, and artificial intelligence. It can often feel as though we are just trying to keep up, adopting new gadgets to make our lives a little more efficient. But we are not merely using new tools; we have moved into an entirely new kind of culture. As we navigate this digital era, our goal shouldn’t be to run away from technology in fear. Instead, our task is to figure out how to mindfully integrate these tools into our lives without losing touch with the profound beauty of our shared humanity.
If a computer program can mimic human logic and write flawless sentences, we have to ask ourselves a deeper question: What happens to the emotion, the character, and the spirit of our shared human calling?
The Surrender to the Machine
To understand the water we are swimming in, it helps to look at the work of cultural critic Neil Postman. He suggested that human history has moved through three distinct stages in its relationship with technology.
First, there were Tool-Using Cultures. In these societies, tools were invented to solve specific physical problems—think of the water mill to grind grain, or the plow to turn the soil. Tools also helped build things of great beauty and symbolic meaning, like cathedrals. But the tool always served the deeper values of the community. The hammer didn’t dictate how people lived; the people used the hammer to build their lives.
Next came Technocracies. In this stage, tools began to shift the culture itself. When the mechanical clock was invented, for example, it didn’t just tell time; it changed how humans thought about their days, dividing life into strict, measurable hours and minutes. In a technocracy, human traditions and new technologies live side by side, often in an uneasy tension.
Finally, we arrive at our current state: Technopoly. Postman describes this as a culture that has surrendered completely to technology. In a Technopoly, we stop looking to human wisdom, tradition, or the divine for our answers. Instead, we look to the machine. We begin to trust data and algorithms more than we trust human intuition or experience. We measure our success by metrics, likes, and output. In a subtle but profound way, the tools become the boss, and we begin taking our orders from them.
Two Ways of Thinking
If Postman helps us see the cultural waters we are swimming in, the philosopher Martin Heidegger helps us understand what this does to our own minds. Heidegger suggested that there are two fundamental ways humans engage with the world:
The first is Calculative Thinking. This is the mindset of efficiency. It plans, investigates, organizes, and computes. It looks at the world—and even at other people—as resources to be managed, optimized, and put to use. Calculative thinking is incredibly powerful. It builds bridges, cures diseases, and powers our smartphones. AI is the ultimate engine of calculative thinking.
The second is Meditative Thinking. This mode is entirely different. It isn’t about solving a problem or maximizing output. Meditative thinking requires dwelling, waiting, and listening. It is the posture of standing before a beautiful piece of art, comforting a grieving friend, or sitting in quiet prayer. It doesn’t produce an immediate “product” or a quick fix. Heidegger beautifully described this posture as “thinking as thanking”—a way of being deeply receptive and grateful for the world around us.
The danger of our technological age is that it demands we use calculative thinking for everything. When we rely on AI to write our heartfelt emails, craft our speeches, or outline our creative projects, we risk treating human connection as just another data problem to be solved efficiently. We bypass the quiet, sometimes difficult work of waiting and listening.
The Loss of Our Internal Compass
When we rely too heavily on the calculating power of machines, we face what some scholars call “moral deskilling.” Think about what happens when you use a GPS app every time you drive. Over time, you lose your own internal map of the city. You forget how to navigate on your own.
A similar thing happens to our hearts and minds when we outsource our deepest reflections to algorithms. Artificial intelligence operates by predicting the next most likely word based on billions of patterns on the internet. It is highly sophisticated, but it operates purely on probability. It has the structure of language, but no real understanding. It has syntax, but no soul.
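The claim that such systems operate "purely on probability" can be made concrete with a toy sketch. This is not how any production model is built, just a minimal illustration of the principle: the hypothetical `predict` function below counts which word tends to follow which in a tiny corpus and always emits the statistically most common continuation, with no grasp of what any word means.

```python
# Toy sketch of next-word prediction by probability alone.
# Illustrative only: real language models learn billions of
# parameters, but the underlying move -- pattern, not
# understanding -- is the same.
from collections import Counter, defaultdict

corpus = (
    "the tool serves the community "
    "the tool shapes the culture "
    "the tool becomes the boss"
).split()

# Count which word follows which: pure pattern-matching.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict(word):
    """Return the statistically most likely next word."""
    return following[word].most_common(1)[0][0]

print(predict("the"))  # "tool" -- simply the most frequent pattern
```

The output is fluent-sounding precisely because it mirrors the patterns it was fed; the model has the structure of the sentences, but nothing behind them.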
The messy, inefficient process of wrestling with a difficult idea, struggling to find the right words to comfort someone, or sitting in the silence of study is not just a hurdle to overcome. That process is the point. In that struggle, our character is formed. If we skip the difficult process of preparation and let a machine do the work, the final product might look polished, but it will be hollow. We don’t just lose a practical skill; we lose an opportunity for spiritual and personal growth.
The Power of the Particular
Because AI draws on the vast ocean of the internet, it represents the “average” of human thought. But true belonging and human flourishing never happen in the abstract “average.” They happen in the beautiful, messy, specific details of real life.
An AI can write a brilliant, perfectly structured essay about the concept of grief. But it does not know the specific pain of the neighbor down the street who lost her husband last Tuesday. It cannot read the room. It cannot look into the eyes of a friend and offer silent, loving presence.
A machine can mimic the sound of meaning, but it is up to us—living, breathing human beings—to actually make meaning together. Building a community, supporting one another, and finding our purpose are practices that require a body, a voice, and a shared history.
Reclaiming Our Human Center
We cannot banish the tools of our age, and we shouldn’t try. The way forward is not to smash the computers, but to mindfully navigate the tension between the calculative and the meditative.
We can absolutely use the calculating power of technology to handle the administrative drudgery of our lives. Let the machines manage the spreadsheets and organize the data. But we must fiercely protect the sacred, quiet spaces of our relational and intellectual lives.
We have a choice to make. We can let technology become the ultimate authority in our lives, or we can put it back in its proper place—as a tool that serves our deeper values. By intentionally making space for meditative thinking, we reclaim our agency. We choose to wait, to listen, and to give thanks in a culture that constantly shouts at us to compute and produce.
Let the machines handle the efficiency. We are called to something much more profound: the beautiful, irreplaceable, and wonderfully inefficient work of love, presence, and belonging.
Bibliography
On the Sociological Critique of Technology
- Postman, Neil. Technopoly: The Surrender of Culture to Technology. New York: Knopf, 1992. (Source of the three stages of culture: Tool-Using, Technocracy, and Technopoly, as well as the concept of the “surrender of culture.”)
On the Ontological Critique of Technology
- Heidegger, Martin. Discourse on Thinking. Translated by John M. Anderson and E. Hans Freund. New York: Harper & Row, 1966.
- Heidegger, Martin. The Question Concerning Technology, and Other Essays. Translated by William Lovitt. New York: Harper & Row, 1977.
- Heidegger, Martin. What Is Called Thinking? Translated by J. Glenn Gray. New York: Harper & Row, 1968.
On Moral Deskilling and the Philosophy of Technology
- Borgmann, Albert. Technology and the Character of Contemporary Life: A Philosophical Inquiry. Chicago: University of Chicago Press, 1984.
- Vallor, Shannon. Technology and the Virtues: A Philosophical Guide to a Future Worth Wanting. New York: Oxford University Press, 2016.
On the Mechanics and Critique of Artificial Intelligence
- Bender, Emily M., Timnit Gebru, Angelina McMillan-Major, and Shmargaret Shmitchell. “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?” Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency, March 2021, 610–623.
- Vaswani, Ashish, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, and Illia Polosukhin. “Attention Is All You Need.” Advances in Neural Information Processing Systems 30 (2017).