The advocate for New Zealanders' mental health
By Shane Muller

SafeWatch

• 4 min read

Let's Not Lose What Makes Us Human

We met Shane at TheMHS 2025. Here is our report on what he had to say.

I am an entrepreneur. Everything I do is driven by solving real problems and making a positive impact. That is what led me to mental health.

In 2019, I started looking closely at what was happening. I was listening to stories and comparing them with the data. What I saw was confronting. The number of people choosing to exit life was deeply troubling, especially among young people. What concerned me most was the trajectory. Even before COVID, it was clear we were heading in a dangerous direction.

There was a pattern in the stories. People would say, “I had no idea,” or “I didn’t think it was that bad.” This was not about a lack of care. It was about missing the signs.

We are living in a faster world.

We do not have time to read those signs properly. At the same time, people present differently depending on where they are. At work, at home, with friends, online.

We have created multiple versions of ourselves. The risk is that no one, including those closest to us, knows which one is real.

That is where I started. Most mental health tools are built for the individual who is struggling and is aware of it. But when someone is at their lowest, they often lose awareness and motivation. They give up on themselves. That is why they do not engage with those tools when they need them most.

So I asked a different question. Who is best placed to notice when someone is struggling?

The answer is not the system. It is the people around them.

The people in your life can see you late at night. They can read your silence. They notice what you say and what you do not say. They act because they care, not because it is their job.

I started thinking about this as a village. Everyone has one, even if it looks different. Within that village is a vast, largely invisible group of carers. Some are professionals. Many are not. Friends, family, people with shared experience.

This is the largest support network we have, and it is not being used intentionally.

From that came SafeWatch.

SafeWatch™ offers a secure, community-driven mental health platform designed to empower individuals on their wellness journey. By intersecting innovative digital health technology with compassionate human care, SafeWatch creates a "Village Approach," connecting individuals with their trusted networks for support while preserving privacy and ensuring proactive mental health management.

The idea is simple. If we can bring together the perspectives of the people in someone’s life, we can create a clearer picture of how that person is really doing. No single person sees everything. A parent sees one part. A teacher sees another. A friend sees something else.

When those views are combined, patterns emerge.

We built what I describe as a mirror. A way for someone to see how they are showing up across different parts of their life. Without that visibility, there is no awareness. Without awareness, there is no change.

This is not about replacing care. It is about strengthening it. It is about making sure no one has to say, “I didn’t know.”

Technology is not the answer on its own. I am deeply involved in AI. I build in this space. But when it comes to mental health, there is a line we cannot cross. Human relationships must come first.

Technology should sit on top of those relationships, not replace them. If we create systems where people no longer need to connect with each other, we lose something fundamental. We risk a future where validation comes from a programmed device rather than from another human being. That is not progress.

The challenge is that this shift is already happening.

  • People are using AI for support. That is not hypothetical. It is real. And like any new technology, there will be failures. With something like this, those failures may not be visible. They may happen quietly, in isolation.
  • We also need to recognise that AI is a tool. In the right hands, it can support wellbeing. In the wrong hands, it can be used to influence and control people. The same capability that could motivate someone to take care of themselves could be used in harmful ways.

Most people are not thinking about that.

Policy, ethics, and morality are not keeping up with the pace of development. The technology is already moving. It is an avalanche. We do not have the luxury of time, but we also cannot afford to ignore the risks.

There are also structural realities we are not confronting.

Building and maintaining these systems is expensive. It is not a one-off investment. It is ongoing, complex, and permanent. Once we start, we cannot simply stop. At the same time, AI relies on data. Large amounts of it. People are already providing that data, often without fully understanding the implications. That data can be used to predict behaviour. It can also be used to influence it.

We are effectively turning human experience into data points. That should concern us.

So the question is not whether we build these tools. We already are. The question is how. For me, the answer is simple. We have to be clear about who we are building for. If you are designing policy, writing code, or shaping these systems, you need a reference point. Someone real. Someone you care about.

Because that is what is at stake. If we get this wrong, we do not just build ineffective systems. We risk undermining the very thing that supports mental wellbeing in the first place: human connection. Once that is gone, no amount of technology will bring it back.
