Increasingly, developers and publishers are hiring paid community managers to safeguard their online reputation and root out the bad eggs. Many also encourage players to report abusive behaviour, which can then be followed up by admins - the dream being that communities regulate themselves.
Harris continues: "The community can help identify bad apples and good apples but ultimately the developer needs to wield the ban hammer when it is time. Back to the neighbourhood bar example: not everyone can be the Bouncer."
While the community can be used to identify toxic players, it can also be employed to punish them. Famously, MMO Roma Victor took a number of cheats and rule breakers and crucified their characters in game. Public shaming is also used in Runescape, where repeat bot users are placed in stocks and put on trial in a special area called (deliciously) Botany Bay.
Other players can even turn up to the trial to taunt the accused and pelt them with fruit. However, some developers see this public recognition of cheating as a step backwards. "Giving notoriety and attention simply attracts more negative behaviour," explains Harris.
Problem is, with online games becoming larger and more free-form, it's getting tougher to work out what qualifies as 'negative behaviour'. Repeated abuse in a forum is easy enough to spot, but what about in-game bullying or crime? Is this why it took Riot nine attempts to finally ban IWillDominate? And what separates his 'harassment' from legitimate psychological warfare or creativity? Riot have actually employed their own team of psychologists to work out why people troll their game, and what they can do to identify and stop it.
EVE Online provides a great example of one player legally taking actions that would instantly incur a ban in most other games. Player 'Cally' had the idea of starting a bank in-game, which he used to offer start-up loans to Corporations and Miners who didn't have the time or inclination to grind the currency themselves.
After months of running the bank, Cally showed up one day, emptied all the money into his own wallet (about 790 billion ISK, or $170,000 in real, actual money) and spent it on a giant, near-indestructible ship. He even used a good chunk of the cash to put a bounty on his own head, before taunting his victims into seeking fruitless revenge.
Devious, sure, but no in-game rules were actually broken, and CCP even applauded his efforts before they realised he'd destabilised the entire EVE Online economy. In fact, you could argue that - before he ran off with the cash - Cally was doing CCP a service by trialling a variant of the Free-to-Play model (pay money to get better items quicker) inside the game's economy. He was a thief, but he was a visionary too.
Cally's antics, and CCP's response to them, show that developers - while keen to stamp out bad behaviour - love to champion creativity. Developers are now looking into gamification and virtual rewards as a way for communities to actively regulate themselves - more carrot, less stick. Good behaviour leads to rewards, which in turn leads to a better community... in theory at least.
Rewards will never replace the outright ban, but if they outweigh the benefits of cheating or the fun of messing with the rules, they may lessen the need for admins to wield the hammer. Get players heavily invested in a game and they stand to lose more from a ban. Virtual currency, real money, reputation, stats; it stings much harder when you lose a virtual life that you've got so much invested in. IWillDominate is proof of that.