Instagram is now warning users about their offensive captions

This is just the latest step in the platform's fight against online bullying.

by Tom George
17 December 2019, 12:31pm


Instagram has just announced its latest effort to stamp out online bullying on its platform. Users will now begin to receive warnings when the captions for their posts could be considered offensive or insulting. While offensive captions will still be shareable, the warning will suggest that users make an edit in order to foster a more supportive environment on the platform.

Of course, warnings won't stop the most determined of online trolls, but Instagram has said that it has “found that these types of nudges can encourage people to reconsider their words when given a chance.” Earlier this year, the company introduced similar AI tools that asked users if they were sure they wanted to post offensive comments, as well as ‘proactively’ searching for bullying in already-posted photos, comments and captions.

After being ranked in 2017 as the worst platform for cyberbullying, Instagram has since begun altering its user experience in ways that could help protect the mental health of its users: trialling the hiding of likes, banning plastic surgery filters and posts related to self-harm, and allowing users to restrict content to ‘close friends’.

The new caption warning has already been rolled out to some users in select countries but, according to Instagram, it will go global “in the coming months”. Here’s hoping for a more supportive Insta feed in 2020!
