With so much talk in the media lately about Digital ID — and the growing public concern around surveillance, privacy, and data control — I’ve found myself reflecting on how easily good intentions can blur into overreach.
While technology continues to promise safety, convenience, and efficiency, many people instinctively feel uneasy about the potential for misuse or intrusion. This tension between innovation and autonomy reminded me of a conference I attended a few years ago, where I experienced that same uneasy feeling first-hand.
The event included an online presentation by Astrosat, a space technology company based in Musselburgh that works in partnership with the European Space Agency. They shared some fascinating projects: using satellite data to map fuel poverty, access to green spaces, rural isolation, and community wellbeing. It was inspiring to see how technology could be used to inform positive change and direct resources where they're most needed.
But then something caught my attention.
Among the ideas shared was a proposal to track people identified as being at suicide risk, using their phone’s location data to alert emergency services if they approached known “hot spots” such as bridges. The intention was clearly compassionate — to prevent loss of life — but I felt an immediate discomfort.
As a person-centred counsellor, my work is rooted in autonomy, trust, and unconditional positive regard. The idea of monitoring individuals without consent, even in the name of safety, felt like crossing an ethical line.
The human cost of overreach
In trauma work, we understand that safety is not something that can be imposed. It’s something that must be co-created through trust and relationship.
For someone already struggling with their mental health, the sudden arrival of emergency services, possibly with lights flashing and voices shouting, could be terrifying and re-traumatising. Imagine simply walking home across a bridge, lost in thought, and suddenly being surrounded by responders because an algorithm flagged your location.
What might have begun as an act of care could easily be experienced as control, and, for some, as confirmation of their deepest fears: that they can't trust the world around them, that their privacy no longer exists, that even their thoughts are being monitored.
Paranoia would be off the scale.
The heart of person-centred work is believing in an individual’s right to self-determination — even when they’re struggling. We walk alongside people, not in front of them, and we offer choices, not commands. Safety achieved through surveillance is not true safety; it’s a form of fear management.
Technology that listens, not watches
This doesn’t mean technology can’t play a role in supporting mental health. It absolutely can — but it needs to be grounded in consent, compassion, and collaboration.
Here are a few examples of how tech could help ethically:
- Opt-in digital companionship apps that people activate when they choose to, offering real-time human support or grounding exercises.
- Anonymous environmental mapping, where data helps authorities design safer spaces without tracking individuals.
- AI-assisted empathy training, supporting professionals to recognise emotional tone or distress in messages — not to monitor people, but to deepen human understanding.
- Digital regulation tools that teach simple nervous system calming practices like breathing, tapping, or the Voo sound — empowering self-regulation, not surveillance.
- Community-based safety networks, where awareness and care are built through relationships rather than algorithms.
Technology should never become the new gatekeeper of human distress. Its highest purpose is to extend empathy, not authority.
Protecting both life and liberty
When care becomes control, we lose the essence of healing — and replace it with compliance. True trauma-informed practice means protecting both life and liberty.
If technology is to be used in mental health, it must be designed in partnership with those who understand the lived experience of trauma, distress, and recovery. It must recognise that being watched is not the same as being seen.
As we continue to innovate, may we never forget that the most powerful technology we possess is still the human heart — capable of listening, understanding, and connecting in ways no algorithm ever could.
“We can design systems that protect without intruding, and save lives without silencing them.”
Deborah J Crozier
Founder at A Positive Start CIC
Trauma-Informed Practitioner supporting community holistic health and safety
Member of the Association of Child Protection Professionals and Chartered Fellow Member of the ACCPH