Christchurch massacre presents case for regulating Social Media

Paul Brislen

Wellington, March 25, 2019

Calls for social media to be regulated have escalated following the platforms' failure to act decisively in the public interest during the terror attacks in Christchurch.

The cry has been growing ever louder over the past few years.

We have seen Facebook refuse to attend UK parliamentary sessions to discuss its role in the Cambridge Analytica affair, watched its CEO testify but not exactly add any clarity to inquiries into Russian interference in the US election, and seen the company accused of failing to combat the spread of hate speech amid violence in Myanmar.

US representatives are now openly talking about how to break up the company, and our own prime minister has suggested that if Facebook can’t find a way to manage itself, she will.

The offshore issue

But how do we regulate companies that don’t have offices in New Zealand (aside from the odd sales department) and that base their rights and responsibilities on another country’s legal system?

And if we are going to regulate them, how do we do it in a way that avoids trampling on users’ civil rights but still makes sure we never see a repeat of the events of March 15?

Politicians have traditionally been rubbish at regulating the internet, and not just local ones. While the EU got its privacy laws absolutely right, it is also currently grappling with two new regulations that will destroy the ability to share content online, because it doesn’t seem to fully appreciate how the internet actually works. And then there’s Australia, which has introduced controversial new laws about encryption.

Inherent risks

There is every danger that we will overstep the mark and regulate the social media and tech giants in such a way as to make our own lives worse than they were before, and that’s something that needs to be taken into account before we start.

Let us start by making it clear that if these companies want to operate in New Zealand, they must abide by New Zealand law. Shouldn’t be too hard since they all say “oh yes, we always operate under local legal constraints” wherever they are in the world.

In Germany, for instance, with its harsh penalties for Holocaust denial and Nazi symbols, Twitter, Facebook, Instagram and all the rest manage to avoid upsetting people by routinely filtering out such content. If they can do it in Germany, they can do it here.

So what laws do we currently have in place that might provide a platform to work from?

Current Legislation

In New Zealand we have the Films, Videos, and Publications Classification Act to protect us from the type of content nobody really wants to see. If content meets the criteria, it is deemed objectionable and anyone caught with it, or caught sharing it, can expect a hefty fine and jail time.

But prosecuting individuals caught actively sharing the video of the Christchurch mosque shootings under the Act isn’t likely to prompt changes in the social media platforms themselves.

We could start by making this Act more applicable to the content hosts as well as to the uploaders. Currently, under the Harmful Digital Communications Act there is a safe harbour arrangement. If you do the right thing by the law and act quickly to remove the content, we’ll let you go about your business. I would like to see that beefed up.

Let us see how quickly they can respond: make it mandatory for them to report quarterly how many complaints they receive about content and how they acted on each one.

Fixing time frames

Let us put a time frame in – say 24 hours to assess and remove content. Let us put in some real incentives as well – rather than a $10,000 fine let’s move to a model that will really get their attention. How about $50 million or 4% of global revenue per offence?

Let us not leave the decision-making on what is and is not objectionable up to minimum-wage monitors based in the US who do not know New Zealand law.

Let us require that the community standards applied to New Zealand content for New Zealand users are based on New Zealand law.

And if we are going to have live streaming video footage uploaded by anonymous individuals, let’s have a look at how best we can monitor and manage that. All video to be tagged with a hash, for starters. This is a short digital fingerprint computed from the video file that uniquely identifies it, so if the video needs to be pulled from public view, copies of it can be found and removed quickly.
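To make the idea concrete, here is a minimal sketch in Python of how a hash can be used to match an upload against a blocklist of videos already ruled objectionable. The file name and the blocklist entry are hypothetical illustrations, not any platform’s actual system, and real platforms generally rely on perceptual hashes (which survive re-encoding and cropping) rather than the exact cryptographic match shown here.

```python
# A minimal sketch, assuming a simple blocklist of known-objectionable
# video hashes. The blocklist entry and file name below are hypothetical.
import hashlib
from pathlib import Path


def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Hypothetical blocklist of hashes for videos already deemed objectionable.
BLOCKED_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}


def should_block(path: Path) -> bool:
    """Return True if the uploaded file exactly matches a blocked video."""
    return sha256_of_file(path) in BLOCKED_HASHES


if __name__ == "__main__":
    upload = Path("upload.mp4")  # hypothetical incoming upload
    if upload.exists():
        print("blocked" if should_block(upload) else "allowed")
```

The trade-off is that an exact hash only catches identical copies; the moment a video is trimmed or re-encoded, the fingerprint changes, which is why platforms pair this approach with fuzzier perceptual matching and human review.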

Moderators required

And let us have actual moderators looking at actual live feeds with the power to hit the “dump” button and remove content if it’s offensive. Social media is fantastically quick to remove copyright material (and indeed material that it thinks is covered by copyright law) but incredibly slow to act on everything else so let’s change that dynamic.

Let us hold senior leaders to account for any breaches of the law – just as we have introduced personal liability for company directors.

Our Privacy Act

Privacy is an area that needs strengthening as well.

Our Privacy Act is currently being reviewed but in light of the events of last week it probably needs to be looked at through a new lens. The Privacy Commissioner needs to be able to act decisively and act with some force.

While we are at it, let us introduce a tougher financial reporting regime. Facebook made around $800 million from New Zealand users last year so let us see it pay tax locally.

There is work underway on this – I would like to see it accelerated and scaled up significantly.

Ideally we would work with our counterparts around the globe.

We need to work together with other jurisdictions to make sure these companies are compliant and don’t simply move virtually to another location.

Social Licence

All companies operate under a social licence. We give Facebook and Twitter, Instagram and WhatsApp a huge amount of data about us and they make a huge amount of money from us, and most of that is because we allow them to.

If they are not going to play fairly, then the ultimate penalty is to take our ball and go home – uninstalling the app, refusing to pay for advertising and removing ourselves from the equation may be the only option that actually makes a difference.

But let us try the regulatory approach first.

Paul Brislen is a Technology Commentator. The above article, which appeared on the Radio New Zealand website, has been reproduced here under a Special Agreement with www.rnz.co.nz

Additional Reading: Impact of Article 13 of the EU Copyright Directive: Survey of 1500 video game streamers about how they would respond to the proposed filtering of copyrighted material before it is posted online. https://comparite.ch/article13survey
