This story by Eleisha Foon of rnz.co.nz is republished with permission. Additional information by Kaniva News.

The Prime Minister of Tonga says social media negatively affects many people.

His comments, posted on Facebook last week, follow a plan in Australia to ban children from using social media amid concerns that platforms like Instagram and TikTok are harming young people’s physical and mental health.

It also comes in the wake of New Zealand’s Broadcasting Standards Authority (BSA) report, Freedom of Expression and Harms Impacting Diverse Communities, which surveyed 493 people and found that about a third of Māori, Pasifika and Muslim respondents had read, seen or heard offensive, discriminatory, or controversial views shared publicly in the past six months.

Social media was by far the outlet people criticised most for harmful content, followed by free-to-air TV and then online news sites, RNZ Pacific reported.

The BSA found that half of the diverse audiences surveyed avoided broadcasts because of perceived racist comments, anti-Māori views, biased commentary on the Palestine/Israel conflict, or references to people being labelled criminals or terrorists.

“News media are generally doing a good job of upholding broadcasting standards, but there’s a broader sort of societal issue that we need to tackle, which is around social cohesion and kindness,” BSA chief executive Stacey Wood said.

The report found serious impacts resulting from offensive views being expressed publicly. At a community level, this was seen to “normalise” bad behaviour, potentially affect the aspirations of particular communities, and “perpetuate negative stereotypes”.

“If you’re constantly being bombarded with negativity, discrimination, it does impact on your self-esteem, your mental health,” Pacific mental health organisation Le Va chief executive Denise Kingi-‘Ulu’ave said.

Only a minority of those surveyed felt New Zealand has the right balance between freedom of expression and potential harm, while a majority felt freedom of expression needs to be tempered by the need to respect the views of others.

About 56 percent of Māori, 60 percent of Pacific, 45 percent of Asian, and 41 percent of Muslim respondents believed stronger limits on freedom of expression are needed to prevent harm.

While social media is seen as the most prevalent and harmful source of offensive content, the report suggested themes conveyed on mainstream media are seen as helping to legitimise it.

The “relative anonymity” of talkback radio and social media is seen as encouraging more extreme views to be voiced, with “fewer boundaries in place”.

“I think that social media is a big problem because of the lack of regulation,” Wood said, adding the BSA had “been calling for regulatory reform for 15 years now, because our act was written in a time when the internet barely existed”.

The BSA urged the New Zealand government to consider regulating online spaces to fill the current regulatory void around social media platforms.

Kingi-‘Ulu’ave said the report was an “accurate reflection of what we’re hearing and experiencing within Pacific communities”.

She said politicians also had a responsibility to not fuel ideologies or belief systems that could cause discrimination, hate speech or spread stereotypes.

Kingi-‘Ulu’ave said when New Zealand ACT Party leader David Seymour “made disgraceful comments directed at the Ministry for Pacific Peoples”, they “saw threats and even people entering the Ministry for Pacific Peoples building that was right next to us in Auckland”.

She said it caused “significant trauma to the staff there”.

“We are very concerned about the lack of accountability and responsibility,” she added.

She said media literacy was crucial for Pacific communities to “critically engage with media content and distinguish between trustworthy sources and harmful, making sure that we have a platform for our communities to go and feel that they are represented and that we can help shift that narrative and empower them to speak against harmful content”.

‘Targets of disinformation campaigns’

New Zealand Muslim community leader and human rights activist Anjum Nausheen Rahman said there was a lot of reporting by organisations purporting to be media that operate solely in the online space and are not bound by broadcasting standards.

“The sheer scale of discrimination should be an eye-opener, but there is also an economic argument,” she said.

“These communities are disengaging from platforms and there is a business imperative in being inclusive. These people are then turning to unregulated media and becoming targets of disinformation campaigns.”

Academic Malini Hayma said news media had a role to play in improving the way society views diverse community groups within New Zealand.

“I listen to Newstalk ZB, and there have been instances where I was appalled by the conversations aired, as they lacked understanding and sensitivity toward diverse communities.

“Additionally, much of the content feels overwhelmingly Eurocentric and fails to reflect the diversity and inclusivity that should be represented in today’s media.”

She said research backed the importance of people seeing themselves represented in media.

“It fosters a sense of belonging and challenges biases.

“In a diverse country like New Zealand, the under-representation of Māori, Pacific, and Asian communities leads to feelings of alienation and distrust.

“To address this, media must prioritise inclusive and authentic storytelling that reflects the experiences of all communities.”

Controversial views

All respondents recalled instances of offensive, discriminatory or controversial views.

However, many could not recall the specific source of the content.

As an example of content inciting conflict, the report noted “Destiny Church encouraging negative actions”.

It also cited stereotypes being reinforced, such as Māori and Pacific peoples being ‘all on a benefit’, ‘not academic, low IQ’ and ‘simple people’, and ‘migrants taking all the jobs’.

Unbalanced reporting was called out over the “Israeli/Palestinian conflict and initial assumptions that violent events are terror attacks by Muslims”.

‘Stick to bright lines’

The report’s findings have been criticised by the chief executive of the Free Speech Union, Jonathan Ayling, who raised concerns it would encroach on free speech.

“The BSA should stick to bright lines, and not play wanna-be-censor. We’re concerned that this report will lead to unpopular views being censored on the basis of them being ‘harmful’.”

He said censoring unpopular views does not get to the root of difference or division.

“When speech incites imminent violence, we have appropriate laws in place. Human dignity is harmed by being unable to think and speak freely.”

Making a BSA complaint

Only a few respondents had acted when they had heard offensive views in public broadcasting.

The report highlighted the need for a simpler and faster process for laying complaints with the BSA.

“We have taken notes and we are making changes to make the complaints process less complicated,” Wood said.

However, several Pacific Peoples’ and Muslim groups noted that their culture did not encourage complaining and “causing trouble”.

Pasifika respondents reported experiencing harmful content at the highest rate, but said they did not lay formal BSA complaints because doing so went against their culture.

The most common response to experiencing offensive viewpoints was talking to family and friends, followed by making a complaint to the broadcaster, commenting online, and making a complaint to a government body.

The Human Rights Commission is the first port of call for most, followed by the BSA.

Global crackdown on social media companies

The findings come at a time when governments around the world are looking at ways to crack down on social media giants to protect young people from being exposed to harmful content.

The Australian government this week revealed plans to restrict social media use among children.

The arrest in France of instant messaging app Telegram’s chief executive and the closure of X (formerly Twitter) in Brazil are two of the latest signs that times are changing, with networks beginning to be held more accountable.

The EU has put in place a legal framework that is starting to be applied.

In the US, lawsuits and new laws that are underway could affect the future of large social media platforms. The latest addition to the EU’s regulatory framework will arrive in 2026, when the AI Act comes into force.

You can find the BSA’s Freedom of Expression and Harms Impacting Diverse Communities report here.