YouTube Wants Creators To Be Responsible, But Doesn't Give Them The Tools To Do So

First blog post of the new year. I hope everyone had a good Christmas and a happy holiday season. Now we have to talk about another tech company annoying me.

(YouTube's "video is unavailable" icon.)

Like literally minute one of the new year, YouTube videos are being changed to Limited Monetization and Mature Content because of new policy changes. In typical YouTube fashion, this was not communicated, or if it was, it was buried under less important update emails like Handle changes. This was first reported by Cr1TiKaL:

And then, an update 4 days later:

(Fair warning ahead of time, btw: a lot of videos inbound.)

The Increasing Onus On Creators

I think the wildest thing about this is that the customer support agent Charlie was talking to did not know of these changes without going digging. I get that YouTube is a big company, but you would think there would be a memo in their Slack channel or something. That's wilder to me than the fact that this was "snuck in," as some YouTubers felt.

Now, the reasons for this change could be numerous: more pressure from advertisers being put on sketchy videos; the Gonzalez v. Google case, which brings into question whether an algorithm suggesting and/or recommending content counts as an endorsement of that content by the company (that's a whole other topic for another day, and given Charlie's videos I can bet good money it played a role in these policy changes); or the algorithm had a 0.01% change, everything went sideways, and enforcement got more wonky.

(A video on the Gonzalez v. Google case, if you somehow missed it.)

The reason for this change doesn't matter too much in the grand scheme of things. It's not nearly as important as how YouTube is enforcing the policy, and the tools it provides to help creators comply.

Another example: RTGames has several videos limited and age restricted. The crazy thing is that it started with one video, and then after he asked YouTube for help, more videos got limited as well due to the new policy changes. Shoddy communication as well. This got so bad for RTGames that he is straight up telling people not to ask YouTube for help, as it might make problems worse.

I don't think obfuscation should be a reasonable solution. At the end of the day, independent creators and businesses should be working with the site, not against it. Creators should not feel that they will be punished arbitrarily for trying to reasonably resolve an issue, let alone have the status of their channel made worse because of their efforts to resolve it.

Since COPPA brought the hammer down on YouTube, creators are expected to take on more responsibility over their content than ever, and that trend is only increasing. To a point, this does make sense: creators should take some responsibility for the content they produce and the message they are communicating. The problem I've always had with this is the lack of tools to comply in a balanced way, and that creators tend to lack meaningful control in key aspects of content creation. For instance, a "family friendly" video is not necessarily for kids, the same way that having some violence does not make something 18+. There's a lack of middle ground. As for meaningful control, creators can't effectively pick and choose which ads could appear on their videos, as seen in this old video by Hank Green:

Yet The Current System is Too Restrictive, and Lacks Nuance

I want to suggest something that will come across to many as obvious: YouTube should have more than a binary age rating, and should give creators the means to add content warnings.

I expect some criticism of this, and maybe I'm a little naive, but hear me out. There's more to age ratings than just For Kids, Appropriate For All Ages, and 18+ Mature. There's nuance. Teen Titans Go!, Family Guy, The Simpsons, South Park, Rise of the TMNT, and Love, Death & Robots are all animated, but each is intended for a different age demographic, with different kinds of content and content intensity. The same could be said for live-action shows like Malcolm in the Middle, Two and a Half Men, and NCIS.
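To make the idea concrete, here is a minimal sketch of what a tiered rating scale could look like. This is entirely hypothetical: the tier names, the example show placements, and the `visible_to` check are my own invention, not any real YouTube feature or API.

```python
from enum import IntEnum

class AgeRating(IntEnum):
    """Hypothetical tiered scale, loosely modeled on TV content ratings."""
    KIDS = 0       # made for kids
    EVERYONE = 1   # fine for all ages
    TEEN = 2       # mild language or cartoony violence
    MATURE_16 = 3  # stronger content
    ADULT_18 = 4   # explicit content

def visible_to(viewer_ceiling: AgeRating, video: AgeRating) -> bool:
    """A video is shown only if it falls at or under the viewer's ceiling."""
    return video <= viewer_ceiling

# A teen-level viewer would see a TEEN video but not an 18+ one.
print(visible_to(AgeRating.TEEN, AgeRating.TEEN))      # True
print(visible_to(AgeRating.TEEN, AgeRating.ADULT_18))  # False
```

The point of using an ordered scale instead of a yes/no flag is exactly the nuance argued above: "not for kids" and "18+ only" stop being the same bucket.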

Content warnings are one way we can better categorize content without it going full "Limited". Again, not all violence is Doom Eternal or that terrible The Callisto Protocol; sometimes it's merely cartoony, like Rumbleverse or Sly Cooper. As much as teens would dislike this, I can't deny that gameplay content has to play a role in determining an age rating, along with the creator's supplementary content.

And demographics are wide on YouTube, a place where you can find a video on just about anything and everything (of all time). It only makes sense that these audiences are appropriately accommodated.

Plus, the algorithm, in all of its "wisdom," loves getting information. Wouldn't this better serve the right content to the people looking for it? It's not like YouTube isn't trying to figure out who YOU are all the time, so why not better sync with that instead of the AI purely making that decision? That is, unless YouTube only cares about clicks and outrag- I mean, engagement.

The reason I make the argument for content warnings is to help users and advertisers be better informed about what they wish to watch. It's not like this would be hard to implement, as there are already some videos I've found that carry a viewer discretion disclaimer. It would be helpful to have that as a self-imposed content warning. Ideally, a user would be able to customize when these content warning pop-ups occur, keep the rest of the warnings in a collapsible sidebar, and maybe filter by age rating, similar to Newgrounds.
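As a rough illustration of the customization described above, here is a hypothetical sketch of splitting a video's warning tags into "pop up immediately" versus "tuck into the sidebar," based on a viewer's own preferences. The tag names and the function are invented for illustration only.

```python
def sort_warnings(video_tags: set[str], popup_prefs: set[str]):
    """Split a video's content-warning tags into two groups:
    tags the viewer asked to see as pop-ups, and the rest,
    which would live in a collapsible sidebar."""
    popups = video_tags & popup_prefs   # intersection: viewer opted in
    sidebar = video_tags - popup_prefs  # everything else stays quiet
    return sorted(popups), sorted(sidebar)

# A viewer who only wants flashing-light alerts to interrupt playback:
video_tags = {"violence", "bullying", "flashing-lights"}
popups, sidebar = sort_warnings(video_tags, {"flashing-lights"})
print(popups)   # ['flashing-lights']
print(sidebar)  # ['bullying', 'violence']
```

The design choice here is that the viewer, not the platform, decides which warnings are intrusive, which mirrors the argument that one-size-fits-all enforcement is the real problem.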

This is sort of inspired by Rebecca Parham getting dinged for her bullying-related video. While I think it's, for lack of a better word... stupid, that discussing a serious issue in a genuine way is deemed demonetizable, I can at least sympathize that serious discussions might need that sort of designation, even if the content is ultimately appropriate for viewers 13+ or even younger and is thus falsely set to Limited by the system. A creator talking about necessary topics should be able to do so without the whole video going out of whack, especially if the creator is known to "play ball."

Another argument is that not everything with a more edgy, serious, or adult bent is made in bad faith (i.e. as a "gotcha," or in poor taste). There are many videos that handle conversations about sensitive topics like bullying and mental health tastefully. I would like to think that with a content warning and age rating system, advertisers and YouTube would be less likely to instantly pull the "Limited monetization" trigger, as both would know what they're getting into and could better gear their ads accordingly. I imagine it would still happen sometimes, as you have to anticipate bad actors (creators hanging too close to the edge, truly vile content, etc.), and mistakes happen, so purely relying on creators to correctly label their content 100% of the time would not solve the issue. But by having things be more incremental, with nuance built into the system, I would like to think this would happen less.

Through self-moderation, by being clear about what an advertiser, algorithm, or user can expect, this could help build a system that is less prone to assuming the worst-case scenario of a video's topic, subject, or material. That is, unless the creator has been proven by the community and moderation to repeatedly act in bad faith.

Now, while I am talking about infrastructural changes, we cannot ignore one elephant in the room: YouTube's unevenness in policy enforcement. This feels random to a lot of YouTubers, but it seems to manifest in two main ways:

  1. Small or independent creators receive more scrutiny and less support than larger creators or established brands. This includes less harsh punishment, if any, for those larger channels on average.

  2. While all YouTubers have complained about the policy changes, there has been awareness that they disproportionately affect minority communities such as LGBT+ and BIPOC creators. These complaints are not exclusive to YouTube, but the problem is present there. Ironic, given the kinds of radicalizing garbage that can be seen on the site as a new user. (Go on YouTube Shorts and just scroll if you never have before. I dare you.)

(This video is often shared around as evidence of point number 2.)

Regardless of policy changes, this uneven enforcement must be acknowledged and addressed, or else not a whole lot changes. Or worse, YouTube devolves into a video version of Elon's Twitter. And in case you've been living under a rock, nobody wants to be an Elon Twitter.

Now, is this a perfect solution? No, probably not. There's still the chance of false positives, even with more variables now. It assumes that creators will act in good faith most of the time by correctly labeling their content. And I am assuming that if YouTube moderation had to step in, their approach would be fair and even-handed, when current trends have shown the opposite (with this last one possibly needing internal changes). Frankly, I'm sure those who actually create content as their profession might have even better ideas than a Canadian who merely spends way too much time on the internet, but the point still stands. If YouTube, or any major user-generated content site, and governments are going to put this much responsibility on creators, there should be tools to let them properly act on it. One way is tools to best self-moderate content for a given audience and provide viewer advisories. It's only fair.

#YouTube #RTGames #PolicyChanges

Update 01/07/2023: RTGames has made a video about his experience, and wow, the 18+ policy is even worse than I thought. To watch content under the 18+ restriction, you need to provide government ID, depending on the country. While I can understand Google's reasoning for this, it makes the current changes more worrying and a stickier situation than I initially thought. I still stand behind the statements made in this blog post, as I remain firm that content should not be treated so black and white. Video below:

A Nemes Content Blog 2022.