>In my opinion this was a long time coming. Reddit has long since shown that the original hands-off model is woefully inadequate in the face of communities willing to expend the effort to argue continuously in bad faith, organize to influence and control opinion in other communities, and attack the platform itself in their campaigns for hateful speech. Just ask /r/BlackLadies if you think these users "stay in their containment areas". Hopefully Reddit is turning the page to better empower its communities to protect their users and keep hate off the platform.
All this will do is push these folks out to different platforms, where they will organize brigading and bad-faith participation in the dark. There are gobs of Matrix/Discord/IRC rooms where people organize the manipulation of social media, and this move just removes Reddit's visibility into it.
IMO it's partially a discoverability issue. Hate groups target young/vulnerable people, starting off with subtle/ironic hate and then grooming them into more serious hate. If you read leaked chat transcripts, and even KKK manuals, they discuss this specifically as a tactic. Moving this stuff off of Reddit makes it harder for those people to find, because they'd have to go looking for it. It doesn't solve the problem of hate in society, but it helps.
The other thing is that Reddit is inherently multi-topic. You can't host hate groups and quarantine them from the non-hate groups. What you get instead is a bunch of people who showed up for the hate groups also commenting hateful things in the non-hate groups, which creates problems for people who just want a hate-free place to talk about their gardening projects or video games.
People always say this, but it seems like superstition to me; I haven't seen much evidence to support it. After the last wave of subreddit bans, it was "Well, obviously they'll all just go to Voat." But the corresponding Voat communities were anemic, most died out, and now Voat itself isn't even around anymore.
Will some people move to an alternative venue? Almost certainly. Will most people, and will the alternative venue keep growing the way the old one did? I'm pretty skeptical. The reason so many communities are on Reddit is that it's relatively easy to maintain and grow a community there. Deny them that tool and you make it harder for them to do so.
I'm sure most people here who are old enough can think of some old forum they used to love that isn't around anymore. When it died out, it wasn't as though everyone simply moved to another forum and everything stayed the same. That works the same way for negative communities as it does for positive ones.
>There are gobs of matrix/discord/irc rooms where people organize the manipulation of social media, and this move just removed visibility of it from reddit.
True, but then it's not Reddit's problem. Reddit can't solve every ill, but it can cover its own ass.
So if we assume you're right, those users are going to behave like arseholes either way. "Better they behave like arseholes off Reddit" is likely Reddit's point of view, not least for PR and legal reasons.
Let’s try the opposite argument then: let’s give these fringe groups more prominent placement in the Reddit ecosystem. Surely it’s better to give them more of a voice so we know what they have to say than to make them put in the effort to make their voices heard, right? Maybe we should donate money to these groups to help them congregate in one place, so we can keep an eye on them more easily too.
You don’t sell walkie-talkies to Nazis hoping that they will talk themselves out of being Nazis. You take away their voice until the bad idea dies out. Now, if you want to argue that some of these are good ideas, I am all ears on how you’d justify racism, sexism, and xenophobia as valuable to our society. Unless you can do that, your point isn’t valid.
That's not what I'm saying. I'm saying that removing visibility into their motivations, by pushing the bulk of their insular communities' activity offsite, allows them to more easily blend in, use less extreme language to push their disgusting viewpoints, and manipulate conversations onsite.
Essentially it reduces the ability to detect bad actors, is what I'm saying.
No, I understand: it's the idea that keeping them in the spotlight means we can keep tabs on them more easily. I can see the appeal of that solution, but here's the problem: some people will legitimately go to /r/The_Donald to figure out what these people are thinking. Researchers, political operatives, etc. might find that very convenient.
But it has two other effects: (a) it gives them legitimacy (look! One of the larger subreddits on Reddit is an alt-right haven!), and (b) most people aren't critical thinkers. They will see a political meme that mildly aligns with their beliefs, chuckle, and move on. Until they see another, and another. Eventually they'll notice a pattern: these memes are coming from TD, so they subscribe. Next thing you know, they are getting their "news" only from TD memes, and suddenly they live in a political thought bubble, getting further indoctrinated. That's what will happen with 99.9% of the people who end up on a subreddit like The_Donald, whether you like it or not.
So in my estimation, the benefit of keeping TD around (easy visibility into the thought process of these individuals) is greatly outweighed by the drawbacks (providing them with easy means of recruitment and indoctrination).
Also, please keep in mind that for fringe groups like these, Reddit is not their primary form of communication. That is, the community leaders don't just PM each other on Reddit or communicate via posts and comments. If there is any kind of organized effort here to effect real-world change, it is coordinated by private chat, oftentimes with the assistance of anonymizing networks like Tor. Reddit is where people come into this world, but it's only floor 1 of a vast underground bunker (for the communities where these things exist).
Take 4chan as another internet dumpster fire. Anonymous periodically runs its own campaigns, like Pridefall, which can absolutely create real-world problems. But they don't communicate through memes; when stuff starts working on that level, they are coordinating via better channels, and that's happening today. If 4chan were to disappear, I firmly believe that core group would still be in touch and looking to set up another dumpster fire on another social media site, be it a private Facebook group, a Twitter community, a Discord server, or something else. But I'd rather they spend their time chasing their tails trying to get that external visibility back than have them do it in the open.
Also, they already have the opportunity to act more covertly, and they do. I am getting pretty good at spotting bad actors, but I'm sure I miss many more; some are pretty clever in their online interactions. Kicking them off Reddit doesn't give them any better tools for this. They already can, and already are, doing it today.
If you want further proof of this, consider why the members of the KKK wear hoods. Could they do what they do in the open? Maybe. But because they are your neighbors, your politicians, your police officers, and your friends, they need a level of anonymity. Take that away and you make it harder for them to do what they do. Reddit provides the means for that open yet semi-anonymous MO. Taking it away doesn't mean they change their beliefs, but it makes it harder for them to do what they do.
I'm not sure why you're getting downvoted, because this is spot-on. Anonymous paper-folding image boards and other places like that are already celebrating the arrival of the newcomers. It's like sending a petty offender to prison: instead of getting reformed, they just come out a lot more hardcore.
I get that Reddit doesn't want extremism on its site, and I don't have an opinion either way on the bans. But I think it's naive to assume people won't just hop to another, possibly more extreme, platform.
You are missing the point. Organizing, coordinating, and recruiting all have costs associated with them: time spent on them, time spent finding tools for the job, etc. Make a racist subreddit go away and the N people who were part of it won’t stop being racist. But it will be harder for them to find another platform to recruit on, and to some degree harder to coordinate. Time that could have been spent indoctrinating new recruits is now being spent on less productive tasks, slowing the whole community down. Will it stop racism, or even stop the racist group from doing what they do? Of course not. Will it slow them down? Yes, absolutely.
If a bunch of assholes showed up on your front lawn and started doing asshole things, would you kick them out, knowing that they might go to your neighbor’s lawn? Or would you keep them on yours to protect your neighbor? I guarantee you that you’d kick them out, and then likely help your neighbor kick them out too, not shelter them and enable the asshole things by providing your lawn as a platform. How is this different?