
Twitter and racist abuse in football: how can it be tackled? | Paul MacInnes


What has sparked the debate?

On Monday Paul Pogba missed a penalty for Manchester United against Wolves and the match ended in a draw. Shortly afterwards, a number of messages – some purporting to be from Manchester United fans – appeared on Twitter abusing Pogba. They used the “n” word, replaced Pogba’s face in photographs with that of a gorilla, and told him to “go back to selling bananas in Colombia”. Soon enough, the tweets were called out by United fans and reported to Twitter. The next morning United issued a statement condemning the messages: “The individuals who expressed these views do not represent the values of our great club.” A number of United players posted messages in solidarity with Pogba, and the issue of racist abuse on social media platforms became a public talking point.

How bad is the problem?

It is almost impossible to quantify, but the situation is certainly not good. The abuse directed at Pogba was not even the first high‑profile example of the season, with Chelsea’s Tammy Abraham and Reading’s Yakou Méïte having already been targeted. The anti‑racism organisation Kick It Out has attempted to calculate the total number of such posts and in 2015 produced a study which found that 134,000 abusive messages had been sent to football players and clubs between August 2014 and March 2015. The number of discriminatory incidents on social media reported to Kick It Out has since grown to record levels. And this, remember, covers only abuse in football.

Tammy Abraham was another target of racist abuse. Photograph: Richard Calver/Rex/Shutterstock

What is going to happen?

Although some of the abuse has been on other platforms – Méïte, for example, received messages on Instagram – it is Twitter that has become the focus of most of the criticism. The company has agreed to meet United, Kick It Out and “other civil society stakeholders” in the coming weeks to discuss its plans for taking more proactive action against racist abuse.

What are the proposed solutions?

United’s Harry Maguire tweeted one possible course of action: “Every account that is opened should be verified by a passport/driving licence,” he wrote. The England Women’s coach, Phil Neville, suggested a six-month boycott of social media by clubs and players. Twitter’s proposed course of action is more iterative, looking to beef up its reporting and suspension processes. The company talks about having tripled the number of accounts banned within 24 hours of being reported, and increased the number of messages “surfaced proactively for human review”. The company would not confirm how many humans it has doing the reviewing, and admits “progress in this space is tough”.

How likely are the solutions to work?

This is where it becomes complicated. Twitter’s policies have many critics, especially among people of colour and other minorities on the sharp end of abuse. One common complaint involves the terms of Twitter’s “hateful conduct policy”. There are areas of that policy that could be tightened up but, increasingly, there is a sense among Twitter’s critics that more must be done.

Bigger changes such as the one suggested by Maguire have their own challenges, however. Requiring identification would demand complicated new systems not only to process documents but also to verify them. The end of anonymity towards the platform (even if users still appeared anonymous to one another) would affect everyone from whistleblowers to, once again, vulnerable minorities who might be more reluctant to speak their minds. Finally, and perhaps most obviously, would the idea of handing such crucial pieces of personal information over to tech companies be popular with the public?

Manchester United’s Harry Maguire suggested that all Twitter users verify their identity, though would people be happy to hand over their personal information? Photograph: Peter Powell/EPA

Why is this proving so complex?

There are several things in tension with each other, the most glaring being that an apparently public forum is in fact a privately owned platform. The needs of society are not always aligned with Twitter’s interests as a business. However, the company admits that “this behaviour harms the Twitter experience for everyone”, and says “we remain deeply committed to improving the health of the conversation on the service”.

Another conflict is obviously that between one person’s right to freedom of speech and another’s right to live their life in peace. Some of this comes down to context: not just the language used but what it appears alongside and when. Is a gorilla’s face hateful? Not in and of itself, but when pasted on to the body of a human being, the message it sends is different and clear.

What is the best hope for change?

At this point it seems unlikely Twitter will do anything other than promise to do more, more quickly, and perhaps clarify or alter parts of its “hateful conduct policy”. Stemming the apparent ease with which banned users can reappear under a new alias would also be welcome. It remains to be seen whether this will prove enough to appease critics, who are likely then to demand action from government.

One further thing that could be done is to increase the number of legal prosecutions. At the time of writing, Greater Manchester Police are yet to receive a complaint about the messages directed at Pogba and are therefore not investigating. Online hate crime is thought to be seriously under-reported, but prosecutions are growing in number, they have a high rate of success, and stiffer sentences are being handed down. One way of tackling online hate would be to enforce real-world consequences.
