Patreon, a membership platform that makes it easy for content creators to collect donations from their followers, recently changed its pricing structure, a move that sent its community into an uproar. To its credit, the company quickly apologized and reverted to the old pricing scheme. This is a story we've seen play out time and time again: an online network tries to make changes to its community, and users revolt.
While Patreon is the latest example of this trend, Reddit has been the historical poster child. The self-proclaimed front page of the internet is run by a legion of unpaid moderators who are Reddit's biggest asset as well as its biggest liability. When moderators feel disrespected, perhaps by Reddit's founders or other execs, they can make their subreddits private in protest, effectively shutting them down. Reddit lacks any real leverage in these situations. The famous story here is when Reddit fired Victoria Taylor, the main facilitator of AMAs (Ask Me Anything sessions) between various subreddits and celebrities. One particularly hostile AMA with Reverend Jesse Jackson is rumored to have led to her ouster. True or not, the community was on Victoria's side, and the move felt like a slap in the face to the thousands of moderators who were left out of the loop. This incident aside, there is constant tension between Reddit and its users, typically around the policing of content and the management of policies and rules.
YouTube also illustrates this diametric opposition between making money and keeping the community happy. As I write this, the online video site is going through what many are calling an "adpocalypse". We have finally reached the point where digital ad spending has surpassed spending on offline media, and brands are being extra vigilant about where their ads appear online. As it turns out, AT&T, Verizon, and Pepsi don't like their ads showing up next to terrorist propaganda videos or jokes about the Holocaust on YouTube. YouTube is facing a brand exodus, and as a result Google is facing a projected $750M revenue loss in the near term. Naturally, Google is moving quickly to address the problem. The strategy du jour is "demonetization": YouTube is cracking down on channels with offensive content, launching premium ad networks with more desirable inventory for brands, and providing advertisers with "brand safety tools" for more control. However, implementation has been rough. CPMs have purportedly fallen from $25 to $3, and many legitimate creators are being wrongly flagged with little to no recourse. YouTube has to deal with 400+ hours of content uploaded every minute, and its machine learning process for automatically tagging videos is not without issues.
Users hate this kind of one-size-fits-all policing of content. It raises the question: at what point do networks such as YouTube and Reddit go from being free and open to censored? At what point do they start looking like corporate networks pushing an agenda that fits their monetization goals?
In building online networks, Part 1 is building a user or creator community, and Part 2 is monetization. Part 1 is hard enough, and many companies die trying (Vine is a spectacular example). I'd argue that Snapchat is still somewhat in Part 1, given that it needs to show more user growth and has admitted it needs to do a better job focusing on creators. Part 2 companies such as Reddit and YouTube, but also Facebook and any other large-scale network, have such large user bases that they effectively represent human nature, both the good and the bad. Fake news, harassment, bullying, and offensive content are par for the course. There are no easy answers, but I suspect that increased transparency with the community, better tools for advertisers and users, and improvements in machine learning will all help ease the pain.