A couple of days ago, Twitch released its first-ever Transparency Report. The lengthy document covers moderation, enforcement actions, and total reports over the course of 2020. Transparency Reports are relatively new – Google published the first one in 2010, and Facebook, Twitter, Apple, Microsoft, and other tech companies have since followed. But while Twitch hosted a Creator Camp session and made the report very visible to the community at large, we’re not actually the intended audience. Transparency Reports are aimed at regulators and advertisers, and they aim to make the platform look good.
To make these statistics a bit more digestible for creators, we need to remove the PR spin and rebalance some of the stats against actual creator data. Thanks go to @pleasantlytwstd for creating an interesting thread about some of the data – she inspired me to write this post!
Moderation Coverage Decoded
One of the first things that stood out to me was the charts. They are all presented with unusual comparison points that make the numbers appear a bit rosier than reality. Twitch is top-heavy, meaning the majority of content consumed comes from the biggest streamers on the platform. Let’s start with some raw data via Sullygnome.
- There were 18.6B hours watched in 2020.
- The Top 100 Channels made up 3.4B hours watched in 2020. (18%)
- The Top 1000 Channels made up 8.5B hours watched in 2020. (46%)
- 14.7M Channels went live in 2020.
It’s for this reason that the comparison by Minutes/Hours Watched is fairly useless. Data presented that way skews HEAVILY toward the top streamers on the platform – and none of them are without moderators.
Looked at on its own, 1,000 channels make up nearly HALF of all watch time on the platform. Meanwhile, of the ~14.7M channels that went live, the other ~14.699M make up the remaining half. That potentially means ~640,000 channels had no moderation. Let’s go one step deeper…
- There were 769M hours streamed in 2020.
- The Top 1000 Channels streamed 2M hours in 2020.
- This means the Top 1000 Channels made up only 0.3% of the total hours streamed.
- Also, it means that roughly 44M hours of streamed content went unmoderated.
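The share calculations above are simple division against the Sullygnome totals. A quick sketch to reproduce them (figures are the ones quoted in the bullets; nothing else is assumed):

```python
# Sullygnome figures quoted above (2020 totals).
total_hours_watched  = 18.6e9   # hours watched on Twitch
top_100_watched      = 3.4e9    # hours watched on the Top 100 channels
top_1000_watched     = 8.5e9    # hours watched on the Top 1000 channels
total_hours_streamed = 769e6    # hours streamed platform-wide
top_1000_streamed    = 2e6      # hours streamed by the Top 1000 channels

print(f"Top 100 watch share:   {top_100_watched / total_hours_watched:.0%}")    # ~18%
print(f"Top 1000 watch share:  {top_1000_watched / total_hours_watched:.0%}")   # ~46%
print(f"Top 1000 stream share: {top_1000_streamed / total_hours_streamed:.1%}") # ~0.3%
```

The contrast between the last two lines is the whole point: the Top 1000 pull in nearly half the watch time from a vanishingly small slice of the hours streamed.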
The fact that Twitch has made AutoMod enabled by default is a positive change. I hope we continue to see education on moderation, and better tools to support moderators.
User Reports and Enforcement Decoded
In 2020, Twitch received 13,325,734 reports covering everything from Viewbotting to Terrorism. Of those reports, 1,896,356 were acted on. That works out to 14.2% of all reports resulting in an enforcement action. Is that good? Well, we’ll need to go deeper to find out.
The raw numbers are not presented, so the best I can do is estimate from the chart above. By pixel counting (an inexact science), I can make a rough estimate of the number of reports per category: divide each half-year’s total reports by the full bar’s pixel height to get roughly 19K reports per pixel for H1 and ~20K for H2, multiply each category’s pixel height by ~20,000, then add the H1 and H2 totals per category. This is what we get:
- ~6,500,000 Reports (49%) were about Viewbotting, Spam, and Other Community.
- ~3,900,000 Reports (29%) were about Hateful Conduct, Harassment, and Sexual Harassment.
- ~1,500,000 Reports (11%) were about Violence, Gore, and Shocking Content.
- ~1,100,000 Reports (8%) were about Nudity, Pornography, and Sexual Content.
- ~300,000 Reports (2%) were about Terrorism.
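The pixel-counting method can be sketched in a few lines. Note that the pixel heights below are made-up illustrative values chosen to land near ~20K reports per pixel – they are not measurements from the actual chart:

```python
# Estimate per-category report counts from a stacked bar chart.
total_reports   = 13_325_734  # total reports in 2020, per the Transparency Report
chart_height_px = 666         # assumed full-bar height in pixels (hypothetical)

reports_per_px = total_reports / chart_height_px  # ~20K reports per pixel

category_px = {               # assumed segment heights in pixels (hypothetical)
    "Viewbotting/Spam/Other": 325,
    "Hate/Harassment":        195,
    "Violence/Gore/Shock":     75,
    "Nudity/Sexual Content":   55,
    "Terrorism":               15,
}
for name, px in category_px.items():
    estimate = px * reports_per_px
    print(f"{name}: ~{estimate / 1e6:.1f}M ({estimate / total_reports:.0%})")
```

With those assumed heights the output lands close to the bullets above (~6.5M, ~3.9M, ~1.5M, ~1.1M, ~0.3M), which is the nature of the method: it is only ever as good as the pixel measurements.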
How often did Reporting Lead to Enforcement?
The Twitch Transparency Report does go on to list the exact number of actions taken from those reports. This means we can get a general idea of how often Twitch did act on reports.
- 1,722,076 Viewbotting, Spam, and Other Community actions were taken, resulting in ~26.5% enforcement-to-report ratio.
- 80,767 Hateful Conduct, Harassment, and Sexual Harassment actions were taken. This works out to ~2.1%.
- 11,254 Violence, Gore, and Shocking Content actions were taken. That means only ~0.7% of such reports resulted in an action.
- 27,394 Nudity, Pornography, and Sexual Content actions were taken, which is ~2.5%.
- 87 Actions for Terrorism reports were taken, working out to ~0.03%.
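The ratios above are easy to check. A minimal sketch, using the action counts from the Transparency Report against my estimated report counts (so the percentages inherit the imprecision of the pixel-count estimates):

```python
# (actions taken, estimated reports) per category.
categories = {
    "Viewbotting/Spam/Other": (1_722_076, 6_500_000),
    "Hate/Harassment":        (80_767,    3_900_000),
    "Violence/Gore/Shock":    (11_254,    1_500_000),
    "Nudity/Sexual Content":  (27_394,    1_100_000),
    "Terrorism":              (87,          300_000),
}
for name, (actions, reports) in categories.items():
    print(f"{name}: {actions / reports:.2%} of reports led to an action")
```

The spread is striking – roughly 26% for the largely automatable Viewbotting/Spam bucket versus fractions of a percent everywhere human judgment is required.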
Things like Viewbotting and Spam are clearly much easier for Twitch to identify and act upon. Every other category is more nuanced and requires more deliberation before an action can be taken – leading to incredibly low enforcement numbers compared to total reports.
Just for comparison’s sake – Snapchat enforced Harassment reports at a rate of 20%, and Sexual Content reports at 36%. YouTube removed ~8% of videos reported for Sexual Content and ~3% of those reported for Hateful/Harassing Content.
Facebook, Reddit and LinkedIn Transparency Reports also offer insight that Twitch currently does not.
Based on StreamerBans, I estimate that Partners received approximately 1,800* suspensions in 2020. That works out to 0.1% of all punishments being levied against Partners. Considering JUST the Partner class, about 2% of Partners received a suspension in 2020. (Some were repeat offenders, receiving upwards of 3 suspensions.)
*There was a surge of suspensions due to DMCA that I have subtracted to get this number.
How Twitch can Improve the Transparency Report for the Community
During the Creator Camp session, both DJWheat and Angela Hession repeated that this report was just the first, and that it would evolve over time. As mentioned, other tech companies like Reddit, YouTube, and Snapchat offer clearer insight. The number of accounts actioned, response times, and DMCA are all areas where Twitch should be transparent. Additionally, breaking down the data to separate the creator class from the viewer class would be useful. While these reports are meant for advertisers and regulators, the Twitch community is far more engaged with them. It would benefit Twitch to be open and transparent across a wider array of data. This is a start, but I hope to see much more in the future!