
Helping you stay safe on PlayStation Network: How PS4’s reporting system has evolved since 2016

Detailing the evolution of our reporting system, making it better and easier for you to use on PlayStation

Following on from the introduction of ‘Play Time’ in our recent 5.5 System Software Update – and as the first in a series of weekly articles – we’re going to start looking at the ways we can help you stay safe on the PlayStation Network.

Specifically, we’re going to focus on how the reporting mechanisms have been developed, allowing you to tell us in more detail than ever about the user-generated content you don’t want to see on PSN.

How our reporting practices changed back in 2016

In 2016, our Safety and Moderation teams undertook a wholesale review of the report options available to players on PSN.

You might remember what these looked like back on PlayStation 3 (below) – in those days we used the term ‘grief report’ – and, while these did the job back in the day, they were very much of their time.

[Image: the PS3-era ‘grief report’ options]

For the PlayStation 4 generation – and to reflect the wealth of social features that came with it – we felt reporting was in need of a refresh.

What we did to research the reporting system

  • Extensive data analysis of reports submitted by PSN users – what were you reporting, why and where?
  • Research into the emerging trends in this field, both in terms of online gaming and social media – was there anything we could learn from or ideas that would spark our imaginations?
  • Finally, to ensure the end result reflected your views, we reached out to our users on the official PlayStation forums and beta trial community.

Rolling out the updated reporting system in System Software Update 4.00…

The final spec, first rolled out in System Software Update 4.00, had some real standout features:

  • We now gave you multi-layered report reasons, allowing you to be more specific about exactly what your report was about
  • We provided more information about how reporting works, explaining when to use which option and how a human manually reviews every report we receive
  • We gave you some options to quickly resolve your issue – for example, we created easier and more intuitive ways to block other players who, for whatever reason, you preferred not to communicate with

[Image: the updated PS4 reporting options]

  • We even gave you a free text box so you could explain the issue to us in your own words

Around the same time we also implemented system messaging for reports, both to let you know when we’d received your report and to notify you once we’d made our decision.

[Image: report status notification messages]

…and its immediate positive impact on reporting

We immediately saw a hugely positive impact from these changes.

For one thing, by allowing you to better understand how and when to report content to us, we saw the accuracy of the reports we received improve – for example, far fewer of you were now reporting Online IDs that, after our review, did not seem to contain any Code of Conduct violations.

This meant that, on average, the time it took us to respond to your reports dropped considerably, allowing us to resolve the issues that really mattered to you more quickly.

One year on, these changes are still yielding improvements: based on what you told us in the free text part of your reports, we were able to make a change in System Software 5.00 that made it easier for you to find the report option in Messages.

Whilst nothing is ever perfect, and we continue to look for new ways to improve all of our PSN safety features, we hope you agree that this was a significant step in the right direction.

In our next article on safety, I’ll provide more information on what we do once we receive your reports – and hopefully bust some myths that have developed around the subject.

Read more in our “Helping you stay safe on PlayStation Network” series:

10 Comments
2 Author replies
razbivachaslavev 16 March, 2018 @ 17:49
1

jason-bridges 16 March, 2018 @ 17:52
2

I do love a good reporting feature! Been using it to get rid of chat bots, and it’s been months since my last one, so clearly some improvements have happened. Good job!

Nolidior88
3

I used to get around 5-10 bot messages weekly on PS3. Most of them were sent to the block list, and I do understand you need this reporting feature, but it can be used by very bad people in an abusive way. Just looking at YouTube, Facebook and Twitter, and how many people are abusing those systems, makes me sceptical about the whole concept. I do wonder whether, in this case, the people responsible for those reports will stay unbiased, or whether reports will be judged differently because of their politics and/or ideology.

Robert Lewington 17 March, 2018 @ 14:51
3.1

Stay tuned for next week’s article, Nolidior88 – we’re going to tackle this exact point! As a sneak peek, I totally share the view that it’s vital these systems are not abused, and that’s why a human moderator reviews every report. More next week :)

JediKnight246 17 March, 2018 @ 12:33
4

Good to hear about positive improvements. Since PSN is now the most popular online service on console, it’s even more important to keep it from becoming toxic. Improving the reporting system to make it easier for Sony to remove problematic users helps.

4.1

YES!

MaxDiehard
5

I would like to make a suggestion in relation to this, and I hope it is at least noticed and considered.

When the Communities feature was first implemented, someone posted ‘apparent’ inappropriate content into a Community I had ownership of. I woke up one day to find an email stating I was going against the terms of service for something that had been posted.

Now, I found it horribly unfair that I, the Community owner, had to take the blame for something that someone else posted on my page. That is now a mark against my account for something I never did; even worse, I never even got to know what said content actually was.

Simply put, my suggestion is to implement a Community [Report to Admin/Moderator] feature, so members can report alleged content to the owner as a first line of defence, so that they may review and remove content first, as it’s unfair that the owner should suffer for what others do.

In addition, there also needs to be an option to decline member requests to Communities, as I have a lot of inappropriate join requests, and bots I won’t accept, just sitting there in a list. Blocking them does not remove them either.

Thank you for your time.

motifsky
5.1

Seconding the report to moderator suggestion. There are cases where something might not be worth reporting to SIE Moderation but is still something that doesn’t belong on the community or is causing friction. A few months ago there was a user flooding a comm I was in with random screenshots & other off-topic stuff, & it wouldn’t have been right to report them to SIE because they were clearly just being overly enthusiastic rather than malicious, but it was still annoying & was burying on-topic posts. But a random user telling someone off can face backlash for what is seen as backseat modding. Unfortunately the mod of the affected comm had privacy settings that prevented anyone not friends with them from messaging them to give a heads-up, & I ended up biting the bullet & being That Person. I did get other people chiming in with agreement & the user realised they’d screwed up, but I felt like I’d kicked a puppy or something.

Robert Lewington 19 March, 2018 @ 08:57
5.2

Hi MaxDiehard & motifsky, thanks for this. We’ll look into the Report to community admin/moderator suggestion, sounds like it could be a useful addition. I’ll pass on the feedback about rejecting community membership requests too. Good stuff, thanks!

6

Shame you haven’t fixed the issue of mods abusing power on the forums and on the blog.

7

Pointless. I’ve reported people five times since PS4 launched where they were clearly in breach, and every time it comes back as “we found no breach of terms, we’re sorry you felt this was the case…”. I don’t even bother reporting people now, thanks to the pathetic people following up these issues.

7.1

I agree, in practice this “overhauled” reporting system is totally useless.

We close the comments for posts after 30 days.
