Tackling online abuse at UEFA EURO 2024: Insights from the entire tournament

Discrimination should never be acceptable, whether in football or society, in person or online.

During EURO 2024, UEFA ran a dedicated online abuse programme to monitor, report and remedy cases of online abuse in collaboration with Meta, TikTok and X.

First launched at UEFA Women's EURO 2022, the programme covers all UEFA finals and final tournaments, including the youth categories, and is set to run until Women's EURO 2025 in Switzerland.

UEFA can now reveal a summary of online abuse cases recorded throughout EURO 2024 in Germany.

Across the whole tournament, the programme monitored 696 social media accounts belonging to those involved, including players, coaches, referees and the participating national teams. Of all the abusive posts identified, 91% were actioned by the platforms.

"The protection of our players, coaches and referees from online abuse is not just a priority but a responsibility. UEFA's online abuse programme has demonstrated our commitment to creating a safer digital environment. The positive response from our social media partners and the participating national associations has been encouraging, and together we will continue to take care of football."

Michele Uva, UEFA director of social and environmental sustainability

With the tournament now over, a final summary is presented below:

  • A total of 9,142 posts across social media platforms were flagged for review during the tournament.
  • These posts came from 7,810 individual accounts, with 666 posts (7%) eligible to be reported directly to the social media platforms for further action.
  • 91% of those abusive posts were actioned by the platforms.
  • The teams most affected were Belgium, the Netherlands, England and France.
  • 72.5% of flagged posts were directed at individual players, 15.5% at coaches, 6% at team accounts and 6% at referees.

While 25 languages were included in the monitoring scheme, abusive content was identified in 20 of them, including some languages not associated with any of the participating nations.

The monitoring of Facebook, Instagram, TikTok and X has highlighted the types of abuse that players, coaches, officials and other accounts are being subjected to. Some 92.5% of flagged posts were reported for general abuse, which includes abuse that is not specifically targeted at a group or community. In addition, 5% featured racist abuse and 2.5% contained homophobic abuse.

Focus on the final

  • A total of 427 posts across social media platforms were flagged for review during the EURO 2024 final between England and Spain.
  • These posts came from 402 individual accounts, with 54 posts (13%) eligible to be reported directly to the social media platforms for further action.
  • The team most affected was England.
  • 84% of flagged posts were directed at individual players, 15% at coaches and 1% at team accounts.

After each match, the results were shared with the national associations of the teams involved, enabling them to escalate further action, and with the German law enforcement authorities.

The online abuse programme will continue to run across UEFA events for the 2024/25 season.

We strongly encourage everyone to join the fight against online abuse by reporting any abusive or discriminatory content to the respective social media platforms. Victims of online abuse or hate speech who are struggling with self-harm or suicidal thoughts should seek support from medical professionals. If online threats or comments cause fear for personal safety, contacting the police is advised.

Crucial collaboration with social media partners

UEFA greatly values its fruitful collaboration with the social media platforms in tackling online abuse during UEFA EURO 2024. Together, we established an escalation channel before and throughout the tournament to report abusive content promptly, with weekly meetings to assess cases.

In agreement with our counterparts, we can share further insights into the outcomes of this collaboration.

- On Instagram, selected profiles were monitored and 100% of abusive posts reported to Meta for further action were removed.

- Meta helped prevent abusive comments by activating a number of features on both Facebook and Instagram to moderate comments and limit unwanted interactions across accounts, including features like Hidden Words, Limits, Restrict and Moderation Assist.

- On TikTok, selected profiles were monitored and no abusive content was identified. Before and during the EURO, a cross-functional group of dedicated safety professionals at the company prepared by:

1. Developing a comprehensive preparedness plan to understand and mitigate issues that could occur, and proactively safeguarding accounts before, during and after the tournament.

2. Strengthening moderation capabilities to respond to potentially hateful conduct.

3. Cooperating closely with relevant law enforcement agencies across Europe.

- On X, selected profiles were monitored and the entire platform was scanned for specific keywords. X was able to effectively review hundreds of posts throughout the tournament and take further action where needed.
