Roughly adapted from lectures at Blogtalk 2 (Vienna, 2004) and Blogtalk Downunder (Sydney, 2005).
Progressives enjoy an important technological advantage. To some extent, that advantage is inherent: progress is about the future. But in the US, that structural advantage is augmented by the Republican war on science. Evolution, vaccination, fluoridation, conservation: all are despised by the Right. Our last Senate campaign featured an attempt to smear Elizabeth Warren by connecting her to >*gasp*< Harvard, a place that used to be a Republican stronghold.
Senator Cruz has a list of a dozen Harvard Law professors who are Communist advocates of the violent overthrow of the US government. If we’re very nice to him, he might show us that list someday.
It’s no surprise, then, that progressive campaigns have an easier time finding talent to build web sites and forums and to exploit all sorts of new media. We have a deeper bench.
But our world is imperfect, and so is technology. We have our immediate woes from Waltham, and we will have others. We ought to think ahead and prepare for the evil day when it comes.
Therefore, since the world has still
Much good, but much less good than ill,
And while the sun and moon endure
Luck’s a chance, but trouble’s sure,
I’d face it as a wise man would,
And train for ill and not for good.
A recurring problem with open forums is the troll: a colorful creature who enjoys disrupting web communities. Trolls are powerful and plentiful. Given time, trolls will destroy any community on the Web. People don’t want to believe that their wonderful community would attract trolls, but sooner or later trolls always appear. Trolls thrive on attention and will go to great lengths to get it.
Political sites are particularly vulnerable because they attract varieties of troll that leave other sites alone. Often, trolls are after attention: they thrive on stirring things up, on seeing lots of people argue with them. This is usually simple neurosis, but naturally Republicans and other opponents would like nothing better than to see progressives spend their time shouting at hecklers instead of…making progress. A very successful troll can blow up a Web community, and that is exactly what an opposing campaign wants.
Blue Mass is administered with wisdom and skill, but experience suggests that it’s best to prepare for trolls before they arrive. We won’t have time to think things through when the trolls arrive a week before the April 30 primary or the June 25 special election. By definition, trolls leap up at the worst possible moment.
The fundamental difficulty in managing trolls is asymmetry. The troll is an outsider. You, a member of a community (perhaps a founder or supporter or stakeholder) live there. The troll doesn’t really care what people think: they’re stoopid liberals and hippies. You, on the other hand, do care what the community thinks. Trolls can lie, cheat, and make stuff up; you can’t. Trolls can call you things your mother wouldn’t like; your mother reads this site, but the troll’s mother doesn’t.
Trolling is not new: in the early days of the left, trolling was an existential problem. You’d get a crowd together, and then some wingnut would sneak into the margins and start yelling, and pretty soon the wingnut was the story. Hello, Haymarket. (The Bread and Roses strike is a textbook case of effective trolling.)
A lot of the time, alas, the Right didn’t need to supply the wingnuts. We grow plenty of our own. But it’s good to remember that damaging sites like Blue Mass Group would be a good day’s work for the other side, and there are plenty of white- and gray-hat techniques they can use. (Remember that it’s only been a few years since GOP operatives disrupted a NH GOTV effort by systematically crashing the Dems’ phone system.)
How might we prepare for trolls when they arrive?
A simple approach to trolls is moderation. Set rules, publicize them, and let authorized readers decide whether or not a submission should be published. Moderation isn’t open and it isn’t democratic, and it can make it harder for new talent to find a role in the community. For large communities, moderation can become very expensive.
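For the curious, the machinery of moderation is very little code; the expense is all in the human attention it demands. A minimal Python sketch (the moderator names are invented, and a real forum would persist all of this to a database):

```python
from dataclasses import dataclass

@dataclass
class Submission:
    author: str
    text: str

class ModerationQueue:
    """Hold every submission until an authorized reader approves it."""
    def __init__(self, moderators):
        self.moderators = set(moderators)
        self.pending = []      # awaiting review
        self.published = []    # approved and visible

    def submit(self, author, text):
        self.pending.append(Submission(author, text))

    def review(self, moderator, index, approve):
        if moderator not in self.moderators:
            raise PermissionError("only authorized readers may moderate")
        post = self.pending.pop(index)
        if approve:
            self.published.append(post)
        # rejected posts are simply dropped

# Usage: a new reader posts; an editor approves.
q = ModerationQueue(moderators={"editor_a", "editor_b"})
q.submit("newcomer", "A thoughtful comment on the special election.")
q.review("editor_a", 0, approve=True)
```

The cost is plain in `review()`: every single post waits for a human, which is why moderation gets expensive as a community grows.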
The Blue Mass Group standard for moderation is that:
The purpose of Blue Mass. Group is to develop ideas that will invigorate progressive leadership in Massachusetts and the nation.
to which end the site encourages “commentary typical of thoughtful discussion between acquaintances who may hold differing views on important issues.” These are good, but a troll could drive a pickup truck through them by insisting that he is progressive (though he disagrees with everyone else) and that he is thoughtful (because everyone else is an idiot). If cornered, the troll need only point to a few supporters. Incompetent trolls rely on sock puppets, but any political operative should be able to recruit a supporting voice or two.
Instead of full-bore moderation, syndication lets everyone post but gives prominence to featured posts selected by moderators. This works well for DailyKOS, and it’s clearly been working pretty well for Blue Mass Group. Ideologically, though, syndication is nearly as problematic as moderation.
One way to complicate the troll’s task is to remove their anonymity, either systematically (by preferring or requiring people to use real names) or selectively (by identifying specific trolls). Some of the most disruptive trolls may be deterred if their ill manners can be brought home to their friends and neighbors; identifying a troll removes some of the asymmetric advantage the troll enjoys. Political anonymity has a long and respectable tradition in the U.S. and we might not sacrifice it easily, but perhaps we should sacrifice it here.
Schemes that broadly distribute the power to upvote and feature good contributions (reddit) or downvote and hide bad ones (slashdot) appear to address the ideological objections to moderation and syndication. In small and new communities, this can work; in large communities, it appears prone to gaming and to random outcomes. Frequently, these outcomes feature pictures of cats.
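The mechanics of distributed voting are equally modest. A sketch, again in Python, with an invented hide threshold of −3:

```python
class Comment:
    """A comment whose visibility the community decides by voting."""
    def __init__(self, text):
        self.text = text
        self.score = 0
        self.voters = set()

    def vote(self, reader, delta):
        # one vote per reader, either +1 or -1
        if reader in self.voters or delta not in (-1, 1):
            return
        self.voters.add(reader)
        self.score += delta

    def visible(self, threshold=-3):
        # comments voted down to the threshold are hidden
        return self.score > threshold

# Usage: three readers vote down an obvious flame.
c = Comment("First!!1! You people are all idiots.")
for reader in ("alice", "bob", "carol"):
    c.vote(reader, -1)
```

The trouble shows up in `vote()`: nothing stops a troll from recruiting (or inventing) three friendly readers, which is how these systems get gamed at scale.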
A very old approach to the troll is ostracism, or reading a member out of meeting. At need, the community simply ejects the disruptive contributor. This is often cast as a punishment, but it need not be one. Wikipedia has shown that quite modest bans — 24 or 36 hours — combined with a permanent record visible to the community can be sanction enough to discourage disruptive behavior.
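A short ban plus a permanent public record, Wikipedia-style, is also easy to sketch in Python (the names and durations below are invented):

```python
import datetime

UTC = datetime.timezone.utc

class BanLog:
    """Short bans, plus a permanent record visible to the whole community."""
    def __init__(self):
        self.history = []  # permanent: (member, expires, reason)

    def ban(self, member, hours, reason, now=None):
        now = now or datetime.datetime.now(UTC)
        expires = now + datetime.timedelta(hours=hours)
        self.history.append((member, expires, reason))
        return expires

    def is_banned(self, member, now=None):
        now = now or datetime.datetime.now(UTC)
        return any(m == member and now < expires
                   for m, expires, _ in self.history)

# Usage: a 24-hour ban for a disruptive member.
log = BanLog()
log.ban("troll42", hours=24, reason="repeated personal attacks")
```

The point is the `history` list: the ban itself expires, but the record stays visible, and that visibility does much of the deterring.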
Finally, several recent approaches simply increase the friction a troll experiences when being disruptive. eGullet, a culinary community, requires a modest membership fee; you can be anonymous to other members, but the home office knows who you are because you paid them. StackOverflow, a programming forum, grants posting and moderation privileges in exchange for work on the site. Either way, a troll who wants a bunch of alternative personalities or sock puppets (or a crowd of pliable supporters) has to spend time and money to get them. At any rate, the community benefits from the cleanup work and the membership fees.
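The StackOverflow flavor of friction amounts to gating privileges behind earned reputation. A Python sketch; the point thresholds here are invented, only loosely in the spirit of StackOverflow’s real privilege ladder:

```python
# Invented thresholds, loosely modeled on a reputation-gated forum.
PRIVILEGES = {
    "post": 0,
    "downvote": 125,
    "moderate": 2000,
}

class Member:
    def __init__(self, name):
        self.name = name
        self.reputation = 0

    def earn(self, points):
        # reputation comes only from work the community values
        self.reputation += points

    def can(self, action):
        return self.reputation >= PRIVILEGES[action]

# Usage: a newcomer can post right away but cannot yet downvote.
m = Member("newcomer")
m.earn(50)
```

Every sock puppet has to climb this ladder from zero, which is exactly the time and effort the troll would rather not spend.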
The Path Across The River
One clear lesson from the Obama campaign’s tech effort is the importance of preparing for trouble. That trouble might be external — someone might be trying to cause mischief. That trouble might be internal — a disagreement flares into a flame war. The trouble might be arbitrary and inexplicable. When trolls do appear, it would be good to know how to help.