We are not in tech anymore
Hey! Someone got banned from Mastodon! Someone didn't get banned from Twitter! That's a huge problem! Why? Because we don't know why those decisions were made; we see people complaining, and all we see are shadowy figures of authority making decisions and saying, in essence, Trust Us.
This is a Big Problem. If you're conversant with the liberal arts and their studies of power, you know that when people and institutions have power without accountability or transparency, abuses occur.
Lord Acton and his famous dictum nailed it, dead on. The 20th century is full of those without transparency or accountability who viciously abused their power; a shortlist of the most famous: Stalin, Mao, J. Edgar Hoover.
If you're looking for an anti-hierarchical model, I suggest you very carefully read Jo Freeman's famous essay "The Tyranny of Structurelessness".
In the Anglo-American model of juridical procedure, we have learned that evidence, presented in public, matters. We have learned that judges sometimes need to be recalled. We have learned that judges and juries err, and we have systems that address those errors - the term is recourse. You need recourse for injustice, so that the injustice can be redressed. Justice is not justice when a single judge dislikes you and wants to make your problem go away.
This is where the internet has been stuck for decades in its online systems: we have believed that we're just hanging out, that we can jump to a new program, a new system, no worries. That hasn't been true since MySpace blew up and brought us the mass social media age (arguably before, but let's put a pin in that and not define social media).
I want the reader to really think about Jo Freeman's argument there. What we've done - in Mastodon and in open source communities in general - is to say, specifically, that we are choosing to have the developers make decisions, and that the rules of those decisions rest solely upon the developers' whims.
Again, from "The Tyranny of Structurelessness":
Jo Freeman grasped in the early 70s what we struggle with today: the strong and the lucky - the developers and writers of software - have taken control under these concepts. But why does this matter? Isn't this just a means of socialization?
I argue that the essence of this business of online forums - the big ones - is that social media is a civil society forum, particularly in the freewheeling mode of Twitter & its children.
Social solutions require structure, procedure, and accountability to handle this civil society we have found ourselves in. Traditionally, online communities use volunteer moderators, with perhaps a paid admin. That doesn't work at this scale.
But does any of this matter that much on a free & libre platform, one might ask? It does, because this is civil society. There are thousands of people on a single server. This isn't just an admin with 30 of his friends. It's not a bar where the owner can just toss a few people out. This is a city. A society. You need a public governance process with public recourse, not hordes of shadowy figures stabbing each other via the mod system. Mastodon.social, mastodon.cloud, octodon - all the big instances - must not be subject to a moderator's unilateral private decisions.
I understand that requiring enormous masses of volunteer moderators is a problem. These people burn out; they have lives; they are not incentivized to be unbiased beyond personal morality; and they require enormous coordination effort. Personally, I've argued at length on Mastodon against decentralizing social media, precisely because bad actors misbehave and volunteers with inadequate coordination cannot keep up. Governance requires diligent people doing boring work - in other words, often you have to pay someone. A genuine justice system needs to be created.
What specific solutions might I propose to address this problem?
To be specifically helpful: a United Servers of the Fediverse - a USF. Equal rules on banning and report management, similar to extradition treaties; member servers whitelist who gets in.
If you want to have a fair hearing on issues, you join a member server of the USF. The USF has a "treaty" agreeing that "felonies" and appropriate remedies will be handled as a matter of public record. Felonies can vary, but inter-server relations are defined.
Misbehavior by admins gets your server tossed out of the treaty organization - via a defined process.
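As a thought experiment, the treaty mechanics above could be sketched as a membership registry: a whitelist of member servers, plus a defined expulsion process instead of unilateral removal. Everything here is an illustrative assumption - the essay only specifies "whitelist who gets in" and removal "via a defined process"; the majority-vote threshold and all names are hypothetical.

```python
# Hypothetical sketch of USF treaty membership: a whitelist plus a
# defined, recorded expulsion process. The majority-vote threshold is
# an assumption for illustration, not anything the essay specifies.

class TreatyRegistry:
    def __init__(self):
        self.members = set()        # whitelisted server domains
        self.expulsion_votes = {}   # accused domain -> set of voting member domains

    def admit(self, domain):
        """Whitelist a server into the treaty organization."""
        self.members.add(domain)

    def is_member(self, domain):
        return domain in self.members

    def vote_expel(self, domain, voter):
        """Record one member's vote to expel another; expel on majority.

        Returns True only on the vote that completes the expulsion.
        """
        if voter not in self.members or domain not in self.members:
            return False
        self.expulsion_votes.setdefault(domain, set()).add(voter)
        # Expel once more than half of current members have voted.
        if len(self.expulsion_votes[domain]) > len(self.members) // 2:
            self.members.discard(domain)
            del self.expulsion_votes[domain]
            return True
        return False
```

The point of the sketch is the shape of the process, not the threshold: expulsion is a recorded, multi-party decision rather than one admin's private call.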
Another, more abstract example: I would strongly suggest that the Fediverse - or the Mastodon crew, or any crew interested in good governance - dig through the well-worn rules for voluntary organizations and work from there, rather than figuring out "how to resolve disagreement and toss out misbehaving people/entities" from first principles.
These rules and these systems are always messy, and there are always bugs in them. But let's not let arbitrary daily hatreds determine what will occur. We have to play politics. And, in an approximate sense, politics - in the sense of individual actors coming to agreement in the face of conflict - is a solved problem: it's done all the time.
The time-worn book of committee work in the USA is Robert's Rules of Order, a text on parliamentary procedure. It is sufficient to handle vicious politics with almost intractable differences of opinion, but it is, in turn, abusable by rules munchkins, and it is very complex.
There are consensus systems designed to replace Robert's Rules; last time I looked through them, the preferred approach was consensus with up to 2 dissenters: more than 2 dissenters stopped the proposal. I'm not particularly keen on consensus systems; getting a bunch of independent cats to agree that much sounds like a footgun for effective work. They are certainly popular in some left activist spaces, but they require some facilitation to work.
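The "consensus with up to 2 dissenters" rule mentioned above is simple enough to state as code. This is a minimal sketch; the function name, the "stand aside" vote category, and the vote encoding are my assumptions, not part of any real consensus process's specification.

```python
# Sketch of the consensus-with-dissent rule described above:
# a proposal passes unless more than two participants formally dissent.
# Names and vote categories are illustrative assumptions.

def proposal_passes(votes, max_dissenters=2):
    """Return True if the proposal survives the dissent threshold.

    votes: dict mapping participant -> "agree", "stand_aside", or "dissent".
    Stand-asides register discomfort but do not block; dissents do.
    """
    dissenters = [who for who, v in votes.items() if v == "dissent"]
    return len(dissenters) <= max_dissenters

# Example: three dissents block the proposal, whatever the total turnout.
votes = {"a": "agree", "b": "dissent", "c": "dissent",
         "d": "dissent", "e": "stand_aside"}
print(proposal_passes(votes))  # prints False
```

Note how the rule ignores the number of supporters entirely - which is exactly why a small bloc of dissenters can stall work, the "footgun" worry above.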
An effective group decision system needs to accept sharp disagreements, but allow for adequate compromise to move forward. In any case, my argument is that some process is needed; the choice of process is a different question.
For on-instance moderation, I would suggest looking at "infraction/misdemeanor/felony" type classifications, with varying levels of punishment and discretion. E.g., when judging a felony, a single admin couldn't just toss someone off the server unilaterally. That could be part of such an agreement.
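A tiered classification like that could be expressed as a small policy table: the heavier the sanction, the more moderators must sign off. The tier names come from the paragraph above; the specific penalties and approval counts are hypothetical assumptions, chosen only to show the shape of the idea.

```python
# Sketch of the infraction/misdemeanor/felony idea: heavier sanctions
# require more distinct moderators to approve, so no single admin can
# unilaterally impose the worst punishment. Penalties and thresholds
# here are illustrative assumptions.

SANCTION_POLICY = {
    "infraction":  {"max_penalty": "warning",        "approvals_needed": 1},
    "misdemeanor": {"max_penalty": "temporary mute", "approvals_needed": 2},
    "felony":      {"max_penalty": "server ban",     "approvals_needed": 3},
}

def can_apply_sanction(tier, approving_moderators):
    """Check whether enough *distinct* moderators approved this sanction."""
    policy = SANCTION_POLICY[tier]
    return len(set(approving_moderators)) >= policy["approvals_needed"]

# A lone admin cannot impose a felony-level ban...
print(can_apply_sanction("felony", ["admin1"]))       # prints False
# ...but three moderators acting together can.
print(can_apply_sanction("felony", ["a", "b", "c"]))  # prints True
```

Deduplicating the approver list matters: it is what prevents one moderator from "approving" three times and defeating the threshold.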
To zoom out somewhat: looking forward to late 2018 and the 2020 USA election, we - the concerned citizens of the Mastodon-verse - are going to be dealing with enormously toxic organized social media groups, informal and formal, paid and volunteer. The Mastodon-verse has to start building social processes to deal with that problem today.
Then, of course, part of what we are dealing with today, for specific personalities on the fediverse, is weaponized "me and my personal army" situations. The Mastodon-verse has to handle that problem as well as the electoral-campaign infowars. Personal-army harassment is a signature issue in the Twitter world and its children.
There's a lot here to digest and pick apart, and I don't claim to have the right answers. I do claim several key points.
We're not in tech anymore, we are in the humanities...