The trouble with social news

2013-01-24

There is something terribly awry with the social news ecosystem. This is a feeling that's been growing on me over the last few years, and is the reason why I've cut both Reddit and Hacker News (who together constitute pretty much all of "social news") out of my information diet. Although I've mulled over things in various conversations, I've never actually tried to put my feeling of unease in writing, until today. What's spurring me into action is a proposal by Yann LeCun that a model similar to social news be adopted for scientific peer review - self-assembled Reviewing Entities voting on streams of submitted papers, regulated by a reputation system for authors and reviewers. Basically, this is science a la Reddit: complete with subreddits, karma and upboats. I find the idea frankly terrifying.

I guess it's time, then, to put finger to keyboard and lay out what disquiets me about social news.

Karma Corrupts

You start by introducing a reputation mechanism like karma to improve some outcome - say, to increase the quality of comments, or to apply a threshold to restrict voting to trustworthy community members. This seems like a plausible and even elegant mechanism at first, until you discover the terrible side-effects.
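
To make the mechanism concrete, here is a minimal sketch of the kind of thing I mean - a karma score per account, and a threshold that gates who may vote. The names and the threshold value are invented for illustration, not taken from any real site:

    from dataclasses import dataclass

    VOTE_THRESHOLD = 50  # hypothetical: accounts below this karma cannot vote

    @dataclass
    class Account:
        name: str
        karma: int = 0

    @dataclass
    class Comment:
        author: Account
        text: str
        score: int = 0

    def upvote(voter: Account, comment: Comment) -> bool:
        """Count the vote only if the voter has earned enough karma."""
        if voter.karma < VOTE_THRESHOLD:
            return False                  # not yet "trustworthy" enough to vote
        comment.score += 1
        comment.author.karma += 1         # the author's visible status ticks up
        return True

That visible, ever-growing number attached to a person is where the trouble starts.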

Humans are fundamentally status-seeking social apes, and you've now introduced a visible measure of social worth that people will be driven to maximize. In the real world, we have a word for those who spend their lives accumulating karma - we call them politicians. And so, within karma communities, we see the rise of a political class - persuasive centrists who cater (perhaps unconsciously) to a constituency, and who express (perhaps eloquently) opinions calculated to appeal to the masses and avoid controversy. Hacker News and many subreddits are dominated by people like this, whose comments are largely predictable and rarely add anything new or unexpected to the conversation.

At the bottom end of the food chain, we have a different class of creature with the same basic aim as the politicians, but without the persuasive charm needed to pull off the political approach. These are the karma whores, who use a mixture of frank pandering, provocation and calculated outrage to achieve the same aims.

The karma-maximization game often works against the very goals we introduced karma to achieve: the tenor of the community suffers, the diversity of opinion declines, and the karma whores post pictures of their cats everywhere.

The Lossy Sieve

Go and have a look at the new story submission queue on Hacker News. Scroll through a few pages, and pay attention to the stories stuck at one vote - they will most likely never receive another upvote and will sink without a trace. Now, go look at the front page. When I do this exercise I'm struck by the fact that there's plenty of crap on the front page, and quite a bit of good stuff languishing in the submission queue. So quality can't be the sole metric here - what determines what gets onto the front page and what doesn't?

Let's try a thought experiment. First, set up a small number of voting accounts - say, 10 or so. Now, in the new submission queue, pick 5 random stories every hour, and give them a small number of upvotes soon after they are submitted. I predict you'll find that stories receiving this small initial boost are vastly more likely to end up on the front page. If I'm right, then chance dominates story selection - as long as an article exceeds some basic quality threshold, it all depends on who happens to see the story soon after it is submitted, and whether the spirit moves them to vote. Note that this is not the case at the extremes - frankly bad content won't be upvoted, and really important stories will usually find their way to the top. The lossy sieve phenomenon affects everything in between.
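
Here is a rough simulation of that thought experiment, just to show the shape of the effect. Every number in it - the quality distribution, the visibility model, the size of the boost, the front-page threshold - is made up for illustration; it is not a model of how Hacker News actually ranks stories:

    import random

    def simulate(n_stories=2000, boost=3, viewers=50, seed=0):
        rng = random.Random(seed)
        front_page = {"boosted": 0, "organic": 0}
        for i in range(n_stories):
            quality = rng.random()        # latent "goodness" of the story
            boosted = i % 2 == 0          # half the stories get an early nudge
            votes = 1 + (boost if boosted else 0)
            for _ in range(viewers):
                # Assume visibility grows with current votes (rich-get-richer),
                # and that a viewer who sees the story upvotes it with a
                # probability proportional to its quality.
                visibility = votes / (votes + 10)
                if rng.random() < visibility and rng.random() < quality:
                    votes += 1
            if votes >= 20:               # arbitrary "front page" threshold
                front_page["boosted" if boosted else "organic"] += 1
        return front_page

    print(simulate())

With these made-up numbers, stories that start with a handful of extra votes reach the threshold many times more often than otherwise-identical stories that don't - which is exactly the lottery effect I'm describing.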

What this boils down to is that social news doesn't provide an effective filter - good content gets lost, and mediocre content finds its way onto our screens.

The Pinhole Effect

In social news, the front page is king. Most users never go beyond the first or second page of top stories. However, front-page real estate is incredibly limited compared to the volume of submissions on most popular subreddits and on Hacker News. The effect of this is that we're looking at a fast-flowing river of information through a pinhole. Even assuming that the selection mechanism works flawlessly, what you see on the front page is a small sliver of the total, chosen through a consensus mechanism that takes no account of individual variation in tastes and interests. The news you see is not tailored to you - it's tailored to some abstract, average participant, with all the rough edges of individuality smoothed away. The result is that even at its best, the stories that emerge from the social news system feel like a predictable pablum dished up by the hivemind. The subreddit system tries to improve this by allowing communities to self-assemble around interests, but the pinhole effect still dominates in busy subreddits like /r/programming.
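
The back-of-envelope version of the pinhole, with figures that are rough guesses rather than measurements, looks something like this:

    # Illustrative guesses, not measurements: ~30 front-page slots that
    # turn over a few times a day, versus several hundred daily submissions.
    slots_per_day = 30 * 4
    submissions_per_day = 800
    print(f"fraction ever seen: {slots_per_day / submissions_per_day:.0%}")

Even with generous assumptions, the large majority of submissions will never be seen by the typical front-page reader.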

Gaming The System

Social news systems are eminently gameable, and cheating is rife. Part of the reason for this is that a story's destiny depends on a relatively small number of votes. If your story has any merit at all, you significantly increase the likelihood that it will end up on the front page by giving it a small nudge at the beginning of its life. If it has no merit whatsoever, you can still force it onto people's screens with a few tens or hundreds of votes. Conversely, you can use the same effect to censor and oppress views you disagree with if your social news site has downvotes. Anyone who's kept an eye on these things can rattle off examples of gaming in action: the voting rings, the "social media consultants", the vigilante thought-polizei, the political operators, and dozens of other types of manipulation and villainy. What's more, these visible scandals are just the tip of the iceberg. Eyeballs are valuable, and there's an active arms race with social news sites on one side, and a dark army of spammers, scammers and true believers on the other. How much of what we see is affected by this type of cheating? We just don't know, but my suspicion is that the effect is significant.

The point here is broader than any particular instance of gaming. It's that social news sites are structurally susceptible to manipulation in ways that can't be fixed without changing the core of their operation. A system like this might be good enough to deliver rage comics, but I feel queasy trusting it any further.

Community Collapse Disorder

My final beef with social news is a problem that it shares with pretty much all online communities, especially technical ones. We're all familiar with the life-cycle of technical forums. They start with a small community of insiders who create value, which then attracts more people to participate, which then dilutes the quality of the contributions (and often introduces a few pathological bad actors), which then causes the good contributors to move on, which causes the magic well to dry up. Everyone then takes their toys and moves to the next community, and the cycle repeats. We saw this with Usenet and the original C2 wiki, and we are seeing it now with Hacker News and many technical subreddits, all at various points in this life-cycle.

I believe that Community Collapse Disorder is one of the Big Problems online that we don't yet have a satisfactory solution to. People are trying, though. Hacker News, for instance, seems to be rather poignantly aware of its own decline, with some of the best of the old-timers calling for an alternative. Paul Graham himself recognizes the issue, and has been tweaking things in various ways to combat the phenomenon, without much success.

At the moment, we just don't know how to build online communities that are both inclusive and stable. Democracy, here, seems to lead inevitably to decline, and social news sites are no exception.

A better way forward?

A big part of the reason I don't use social news anymore is that my existing social networks have become so much more effective at turning up good content. The absolute best source of news for me is simply the set of links shared by the folks I follow on Twitter. I follow people who post interesting content, and whom I trust to act as information filters for me. Most of them share my technical interests, but some are interesting because they are from my home town, or because they share some more esoteric pursuit with me. So, the news stream I see is exactly tailored to me. At the same time, there is also room for idiosyncrasy - if someone I follow shares something left-field that tickles their fancy, I'll see it. In turn, I try to be a responsible information filter for those who follow me - I find a link or two worth tweeting on most days.

There are still things I miss - Twitter is great for sharing links, but is an awful medium for technical discussion. Google+ could be a better alternative, but just doesn't seem to have achieved liftoff for me. I would also love better tools for aggregating and harvesting links from my social network. At the moment I use Flipboard and Prismatic, but I have issues with both. On the whole, though, these are quibbles. It seems to me that using social networks to filter news is a better way forward - if I were tackling the social news problem, I'd be building tools to support this process.
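
If I were sketching such a tool, the core would be something like this: collect the links shared by the people I follow, and rank each link by how many trusted people shared it. The Share type, the sample data and the ranking rule are placeholders, and fetching the shares from Twitter (or anywhere else) is deliberately left out:

    from collections import defaultdict
    from typing import NamedTuple

    class Share(NamedTuple):
        user: str   # who shared it
        url: str    # what they shared

    def rank_links(shares: list[Share], trusted: set[str]) -> list[tuple[str, int]]:
        """Rank URLs by how many distinct trusted people shared them."""
        sharers: dict[str, set[str]] = defaultdict(set)
        for share in shares:
            if share.user in trusted:
                sharers[share.url].add(share.user)
        return sorted(((url, len(who)) for url, who in sharers.items()),
                      key=lambda pair: pair[1], reverse=True)

    # Hypothetical usage:
    shares = [Share("alice", "https://example.com/a"),
              Share("bob",   "https://example.com/a"),
              Share("carol", "https://example.com/b")]
    print(rank_links(shares, trusted={"alice", "bob", "carol"}))

The interesting work, of course, is in deciding who counts as trusted - which is precisely the judgement my Twitter follow list already encodes.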