• Audrey Tang

    These days, more and more people are exhausted by gleaming AI-polished social media posts. The messaging is emotionally charged, and every sentence seems to make sense. Yet, by the end, nothing sticks in the mind. Information keeps multiplying, as does FOMO. The result is not deeper understanding, but faster burnout.

    This disturbing state of affairs is connected to a question I kept facing at Oxford University's recent Civic AI Conference. In conversation after conversation with participants from around the world, I was asked: If mainstream AI is optimized for maximum efficiency and maximum attention extraction, can we choose a different path?

    Not long ago, in Dharamshala, India, I was blessed to glimpse a possible answer. Several geshes involved in Monlam AI, scholars trained to the highest level of Tibetan Buddhist philosophy, showed me their model. What makes it valuable is not that it poses as an all-knowing teacher, but that it grows out of a community's long, shared care for language, scripture and the lived context of practice.

    The model can help draw more people closer to Buddhist teachings. But the geshes were crystal clear: No matter how well a model writes, it cannot replace lived experience. AI can generate sentences that sound profound, but this does not mean it has actually passed through confusion, discipline and insight.

    They also shared a Tibetan folktale. A man races along beside his horse, panting, but refuses to mount his steed. Asked why, he says: "I'm in a hurry. I don't have time to stop and get on." Many now approach AI the same way, sprinting after every generated output. The moment you start competing with AI on speed, you are not riding the horse; the flow of information is dragging you along.

    On this point, the Dalai Lama offers a clear standard: "Ultimately, AI is a tool ... We should use tools not for control, but to improve human relationships."

    This also helps explain the quiet impatience so many people feel toward AI-generated content. AI has crudely pried apart rhetorical fluency from what is actually worthy of our emotional investment, turning concern into a hollow performance, a form of care-washing. Learning to recognize that fracture is becoming a core literacy of the digital age.

    That is why I keep my smartphone and computer screens in grayscale, so that the world beyond the screen remains more compelling than the screen itself. But personal discipline is only the beginning. At the social level, we need a different kind of AI: not some lofty, all-knowing intelligence, but local, bonded "little helpers" that assist people to listen more closely to one another, repair relationships and then step back.

    Instead of falling prey to AI's endless output, we should learn to stop, get on the metaphorical horse and ride. Amid the constant churn of modern attention, we need to take the reins of our tools and reclaim agency over how we connect with real human beings.

  • (Interview and Compilation by Yu-Tang You. License: CC BY 4.0)