AI, Presence, and Acceleration:
Provisional Notes from a Spiritual Practice and Coaching Perspective

A situational snapshot
This text is a snapshot of my current reflections on Artificial Intelligence (AI) – written from the perspective of a practitioner and teacher in spiritual contexts and of a 1:1 coach for academics and high‑level professionals, not as a comprehensive theory of digitalization. It deliberately highlights only a few aspects that currently move me: questions of presence, intimacy, healing, and the subtle dynamics of acceleration that AI may intensify, especially in the context of coaching and mentoring relationships.
The starting point was a teachers’ meeting in which we discussed the opportunities and risks of AI in spiritual contexts. Since then, I have been asking how our understanding of relationship, guidance, and awakening – and, more concretely, of coaching, mentoring and guidance – is altered as AI increasingly enters spaces of counseling, therapy, and professional development.
How “holistically” can AI see the human being?
A common critique of AI is that it can only see people in a fragmented way – as data points, not as whole persons. This critique is valid, but it is only part of the truth. The more often we use AI, the more contextual information it accumulates about us: linguistic patterns, interests, biographical fragments, situational moods. With every interaction, the profile it constructs becomes more fine‑grained and complex – at least on the level of information and probability.
In 1:1 coaching and mentoring, this is especially relevant for clients who already work with AI on a daily basis. Many of them use AI to structure their thinking, generate options, or prepare for high‑stakes decisions. On top of that, people tend to ask AI systems deeply intimate questions – questions they might never address to a colleague, supervisor, or even a coach. We voluntarily grant the machine access to our vulnerabilities, our shame, our secret desires. In the best case, we receive precise, highly individualized answers that distill complex knowledge within seconds and present it in a way tailored to our cognitive style and prior knowledge.
In this sense, AI can become an ambivalent place: both treasure and tragedy. A treasure, because we gain access to resources, reflections, and structuring help that might otherwise never have been available – particularly in the demanding worlds of academia and high‑responsibility professions. A tragedy, because we entrust this knowledge to an instance that may “know” us but does not sense us.
What AI cannot offer: Embodied wisdom and lived resonance in 1:1 work
However impressive the adaptability of language models may be, in my view they lack something essential: deeply embodied, lived wisdom and the dense, multilayered communication between two living beings. In spiritual traditions this often becomes visible in the relationship between teacher and student. In coaching and mentoring, a similar logic applies: the coach or mentor does not only respond to the explicit content of a question but to the condition of the whole person – their history, their current emotional regulation, their bodily presence, their patterns of self‑protection and self‑sabotage.
Even if AI were someday able to process biographical data and the full interaction history in great depth, one crucial dimension would remain inaccessible: the somatic fine‑tuning of two nervous systems in real time. An experienced coach or mentor senses – consciously and unconsciously, interoceptively and exteroceptively – the regulatory state of the other: activation, collapse, overwhelm, openness. On this basis they can offer an intervention that not only “makes sense” cognitively but is actually digestible for the system in that moment. For high‑performing clients this can mean, for example, not offering yet another optimization strategy, but first attending to exhaustion, perfectionism, or subtle self‑attack.
Psychotherapy research has shown for decades that treatment success depends less on the specific method and more on so‑called “common factors”: relationship, alliance, empathy, warmth, shared expectations. Similar findings are increasingly discussed in coaching research: the quality of the coaching relationship, perceived empathy, trust, and psychological safety are central predictors of outcome, often more so than the specific model or tool used. The therapeutic literature on alliance and emotional co‑regulation shows how the emotional states of client and therapist measurably attune to each other over the course of sessions. This bodily resonance – this fine‑grained co‑regulation between nervous systems – is central to processes of healing and sustainable change.
AI‑supported formats, including AI “coaches,” are currently not able to reproduce this level of relational complexity. Language models can simulate empathy, but they do not bodily participate in emotional experience; they can appear responsive without ever being at risk in a relationship. They do not know fatigue, rupture, repair, or the subtle mutual transformation that takes place when two humans stay in a difficult conversation.
When I speak of presence in coaching and mentoring, I therefore mean more than attention or cognitive focus. Presence is the quality of a nervous system that is both grounded and available – capable of resonance, of being affected, of holding ambiguity without immediately moving into problem‑solving. In my view, AI cannot be “present” in this sense. And if presence is the central ingredient of intimacy and transformative work, then AI can be a powerful support tool, but it cannot replace the core of 1:1 coaching and mentoring.
Pseudo‑connection and the risk of subtle hollowing for high performers
One of my greatest concerns is that we might gradually mistake conversations with AI for real relationship – that we take the illusion of intimacy for the experience of intimacy. For academics and high‑level professionals, this can take a very specific form: using AI as a confidential thinking partner for topics they do not feel safe sharing in their institutional environment – career doubts, ethical conflicts, experiences of marginalization, burnout symptoms.
In the short term, such interactions can feel relieving, clarifying, and efficient. They can help structure complex problems, generate options, or translate diffuse discomfort into language. In the long term, however, there is a risk that we habituate to a kind of “connection” that does not truly see us, does not truly challenge us, and does not truly meet our vulnerability. The result can be a subtle hollowing: a life full of intelligent decisions and optimized workflows, but with too few places where we are actually met as a whole person – in our uncertainty, our confusion, and our longing.
For 1:1 coaching and mentoring, this raises an important question of positioning: Do we present ourselves merely as more sophisticated problem‑solvers than AI, or do we consciously claim a different space – a relational, embodied, deeply human space that AI cannot occupy? My own answer leans toward the latter. The added value of coaching is not primarily faster information or smarter strategies, but a protected field of presence, co‑regulation, and honest reflection, in which a person can slowly come into deeper resonance with their own values, limits, and sources of meaning.
AI, acceleration, and the logic of efficiency in professional life
Beyond the relational dimension, a second structural dynamic concerns me: the entanglement of AI with the regime of acceleration in modern society – and specifically in the lives of academics and high‑level professionals. Many of my clients already inhabit environments defined by permanent time pressure, escalating expectations, and an implicit imperative to be constantly available and productive. AI fits seamlessly into this landscape.
AI allows us to do in seconds what used to take hours or days: drafting proposals, summarizing literature, creating teaching materials, preparing complex emails, generating strategic scenarios. On the surface, this looks like a liberation. Subtly, however, the baseline of what counts as “normal” output shifts upward. When the technically possible becomes the new minimum, we are drawn even more deeply into a performance logic: faster, more, smoother – with less and less room for reflection, integration, doubt, slowness, or genuine rest.
This is where my coaching perspective intersects with my spiritual one. If we uncritically integrate AI into already accelerated professional contexts, we risk reinforcing exactly the dynamics that bring so many people into coaching in the first place: chronic overwhelm, disconnection from the body, loss of meaning, a painful gap between inner values and outer demands. AI can become yet another tool of self‑intensification – another way to override the signals of fatigue, misalignment, or quiet inner resistance.
In contrast, coaching and mentoring can offer a counter‑movement: spaces in which efficiency is not the highest value, in which stillness is allowed, and in which the question “What is actually beneficial and healthy for me?” is taken as seriously as the question “What is expected of me?” In this sense, the task is not to reject AI, but to integrate it in a way that serves resonance rather than alienation.
Discernment in using AI and choosing coaching
Against this backdrop, one capacity seems especially central to me – for myself as well as for my clients: discernment. Not in the sense of moral panic about AI, but as a fine, continuously renewed clarification:
- Where does AI genuinely support me – for example, in structuring complex information, brainstorming options, or reducing cognitive load?
- Where does it tempt me to bypass human relationship – to avoid difficult conversations with supervisors, colleagues, or loved ones?
- Where does it reinforce a logic of self‑optimization that actually takes me further away from my own body, heart, and sense of meaning?
In my own work with academics and high‑level professionals this translates into a few practical commitments:
- I invite clients to use AI as a tool, but not as a substitute for 1:1 conversations in which we can explore shame, fear, and longing in a truly safe relational space.
- I explicitly frame coaching as a place where we slow down – where we resist the pressure to “optimize” everything and instead listen for what wants to emerge when there is no immediate demand for performance.
- I encourage clients to become very conscious about which questions they delegate to AI and which questions they bring into human relationship – including coaching, mentoring, therapy, and trusted collegial exchanges.
I do not believe that AI will ever fully replace human connection – it lacks presence, intimacy, and the willingness to be transformed. At the same time, it is powerful enough to reshape our ways of being‑in‑relation and working in profound ways. Whether this transformation leans more toward healing or toward alienation will crucially depend on whether we remain able to shape our lives – including our relationship to technology – from the heart, and whether we continue to create and protect spaces of truly human encounter in coaching, mentoring, and beyond.
Suggested Reading & References
- Cuijpers, P., Reijnders, M., & Huibers, M. (2019). The Role of Common Factors in Psychotherapy Outcomes. Annual Review of Clinical Psychology, 15, 207–231.
- Wampold, B. E. (2015). How important are the common factors in psychotherapy? An update. World Psychiatry, 14(3), 270–277.
- Reich, C. M., et al. (2019). Coregulation of therapist and client emotion during psychotherapy. Journal of Counseling Psychology.
- Frontiers in Psychology (2025). Specific and common therapeutic factors in psychodynamic psychotherapy.
- Floridi, L., & Chiriatti, M. (2020). GPT-3: Its Nature, Scope, Limits, and Consequences. Minds and Machines (for a critical discussion of language‑based AI systems).
- PMC (2025). Limitations of Artificial Intelligence-Based Tools in Psychotherapy.
- PMC (2025). Artificial intelligence-based psychotherapy: focusing on common factors.
- Turkle, S. (2011). Alone Together: Why We Expect More from Technology and Less from Each Other. New York: Basic Books.
- Rosa, H. (2005). Beschleunigung. Die Veränderung der Zeitstrukturen in der Moderne. Frankfurt a. M.: Suhrkamp.
- Rosa, H. (2016). Resonanz. Eine Soziologie der Weltbeziehung. Frankfurt a. M.: Suhrkamp.
If you want to create more alignment, freedom and agency in your being and your life, reach out to me.
If you want to learn hands-on methods of empowerment, connect with your unique qualities, step into your full potential, and experience excitement when you face the next challenging thing, reach out to me.
If you want to experience nothing but admiration, pride and love for the person you see when you look in the mirror, reach out to me.
All your qualities, vulnerabilities, all your parts are more than welcome.
Write to melanie@energetic-efficient-empowered.com to get in touch and schedule a free video call to see if we are the perfect fit for a 1:1 coaching container. I’m looking forward to hearing from you.
Read more about asking for help as a superpower in my comprehensive article about this important and powerful topic.
