The advocate for New Zealanders' mental health
By Anna Ashton

Closing the digital harm gap

• 5 min read

I work as a Lived Experience Advisor in specialist mental health services in Canterbury, within Health New Zealand. Alongside that role I run my own project focused on digital harm reduction. I do this work because of my lived experience of online harm and the gap I discovered when I tried to get help.

Why I Stepped Into the Digital Harm Space

Ten years ago I walked into my therapist’s office trying to explain something they had never been trained to recognise. I had experienced digital harm and I needed help. I was already within services, already struggling, but the support I needed did not exist. The words I used were unfamiliar to them. There was no language for what I was trying to describe.

A decade later I work in that same service in Canterbury. I now fill the gap that was empty when I needed it most. Nothing had been created to guide our services in the ten years between those two points, and I still feel shocked by that.

Anna Ashton is a Lived Experience Advisor working alongside Specialist Mental Health Services in Canterbury, using feedback amplified by her lived experience to facilitate service, cultural and practice changes that better meet the needs of consumers and their family-whānau. Anna has a special focus on digital harm reduction because of her lived experience of online abuse and pro-harm communities. To fill the gap in assessment and treatment for online harm, Anna has created a resource that helps clinicians, whānau and young people themselves understand, identify and respond to a range of online harms.

Foreign to some, fatal for others

The consequences were significant for me. What I faced online led me into shame, distress and eventually suicidal behaviour. I found an online community that encouraged this. I also know young people who did not survive similar experiences. We recently had a case linked by the coroner to online grooming, involving someone in the care of the same service I was in at the time.

When I talk about digital harm, I mean the full range of what young people face.

Digital harm includes:

  • Distorted body image shaped by unrealistic online expectations
  • Gaming and gambling exposures
  • Sexual harm including grooming, catfishing, nudes and deep fakes
  • Mental health harms such as communities encouraging restrictive eating or dangerous behaviour
  • A growing set of online terms and norms that adults often do not know

Some of these harms are intentional, such as grooming or creating deep fakes. Some are unintentional, where young people copy behaviour they see online without understanding the impact, including sharing their mental health experiences in ways that become distressing for others.

When I first sought help, the system failed me for specific reasons.

What was missing:

  • No digital literacy training for clinicians
  • No awareness of the full range of online harms
  • Minimal research available
  • No assessment tool to screen for digital harm, equivalent to screening for depression or anxiety
  • No treatment guidelines 

People sometimes suggest bans as a solution, but I do not believe they work. Young people still engage with prohibited things, because the need they are trying to meet does not go away.

My concerns about bans:

Young people will still use social media even if it is restricted, and a ban would silence those who need help, out of fear of punishment or further restriction. The young people who find loopholes around a ban will be those whose social and emotional needs are not being met in their homes and communities: our most vulnerable. These young people use social media as a resource to get those needs met.

Harm reduction is more effective: teaching young people how to navigate the online world safely from the start, with an emphasis on how to get their needs met safely.

What I Have Built

The first thing I created was an online resource, developed after delivering in-person training for some time. I needed something wider reaching, something young people and adults could access directly. The resource explains the context that shapes how social media operates and how young people behave within it.

What the resource covers:

  • The financial motives of social media companies, including algorithms that push negative content to keep users online
  • The motivations that shape youth behaviour online, including copying peers and using technology to meet emotional needs
  • A non-blaming approach to understanding why young people do what they do
  • Advice for supporters on how to have safe conversations
  • Ways to educate at every age level
  • How to introduce age-appropriate freedoms that build a trusting relationship between caregivers and their young people
  • Support for young people who are not yet ready to talk
  • An introduction to several forms of online harm – to understand, identify and respond to them
  • Strategies to reduce harm when things go wrong

Did you know?

Online harms include:

  • Unrealistic standards that fuel negative body image
  • Mental health content
  • Fight videos
  • Leaks, nudes and deep fakes
  • Grooming and catfishing
  • Gaming, gambling and porn
  • Misogyny
  • Chat rooms
  • Dark romance

The resource is an educational tool for both adults and young people. It is useful for parents, teachers, caregivers, school counsellors, specialist clinicians and NGO kaimahi, and of course for young people themselves.

Here is how it is structured. The introductory section is aimed at adults, helping them understand the broader context. The middle section is written directly for young people, although adults watch it too. The final section brings everything together.

My background is in psychology. I completed my honours in research psychology, which shaped the way I built the resource. I combined lived experience with the limited research available. My resource also links to further resources such as: 

  • The Light Project
  • Digital Waitaha 
  • Netsafe
  • Keep It Real Online

I’m also very grateful for the input from my colleague Sarah Tomes, a Whānau Advisor. I write from a youth perspective, so her perspective as a parent raising two young people in a digital world was essential. Additionally, I am in discussion with a professor in digital research at Auckland University to review and validate the resource.

What Needs to Happen Next

I have created something useful, but to take it from a prototype to a national resource, I need specific support.

Can you support me with links to the following?

  • Academics and workforce bodies, including Health New Zealand and the Ministry of Education, to review the resource for safety and suitability
  • A body or organisation willing to lead the creation of national guidelines on digital harm reduction
  • Support to get the resource into schools, community groups, GPs and mental health services

I also need funding for two other elements of the project: tools designed to offer direct support, over and above the education module already described. They are:

  • A digital health assessment tool, equivalent to screening for depression or anxiety
  • A safe and moderated platform for young people to speak about their experiences

The moderated platform is particularly significant. It would act as a live data map where young people can safely record what they have experienced and receive automated validating responses. It would replace the unsafe and dysregulated communication many young people currently rely on. It would capture experiences across all young people, not only those able to enter services.

I have the academic support to supervise this work, but I do not have the funding. That remains the barrier.

My call to action is simple.

  • If you have influence within the Ministry of Education or the Ministry of Health, meet with me and review the resource
  • If you know of funding options for the assessment tool or the platform, connect me with them

My final word is directed at government.

Digital harm will never be removed completely, but the flow can be reduced.

If financial penalties were imposed on social media platforms until they removed the negativity bias in their design, we could create a different online culture. Those penalties could fund education, research, assessment tools, treatment programmes and postvention supports.

Over time, fewer people would need them, but they would be there for anyone who does. That is the strategic choice I believe government must make.

Horizon Newsletter