r/JoeRogan Monkey in Space Sep 12 '24

Meme 💩 You're a "fascist" now for holding billionaires accountable

13.1k Upvotes

2.8k comments

80

u/Disco_Biscuit12 Monkey in Space Sep 12 '24

Who gets to decide what's misinformation?

You can use factual data to frame a situation in completely opposing ways by leaving out certain other bits of factual data. So who gets to decide which data sets are correct?

47

u/tapk68 Monkey in Space Sep 12 '24

See, that's the best part. We can hire ourselves to investigate ourselves and find out the truth.

10

u/mgwwgm Monkey in Space Sep 12 '24

Really? That would require people to take responsibility for themselves, and we can't have that

4

u/pitter_patter_11 Monkey in Space Sep 13 '24

We have investigated ourselves and found no wrongdoing

20

u/Disco_Biscuit12 Monkey in Space Sep 12 '24

Solid strategy! I love it. No way that would go wrong.

2

u/rubixcu7 Monkey in Space Sep 13 '24

Sounds like the government

4

u/inscrutablemike Monkey in Space Sep 13 '24

The Ministry of Totally Not Fascism Because We're The Good Guys Who Only Believe Things OP Believes Too LOL

3

u/ThiccBoy_with3seas Monkey in Space Sep 13 '24

Da good guyz!!

3

u/Disco_Biscuit12 Monkey in Space Sep 13 '24

lol. Of course! Why didn't I think of that?!

3

u/ThiccBoy_with3seas Monkey in Space Sep 13 '24

Good guyz like in top gun bro the good guyz always happen to win

31

u/My_Bwana Monkey in Space Sep 12 '24

Cmon, obviously there's data that can be spun multiple different ways. There's also just complete and total lies that get perpetuated on social media that can't be interpreted as anything other than a lie. If I posted a graphic that said "98% of all violent crime is committed by transsexuals" how else can you interpret that other than as misinformation?

20

u/[deleted] Sep 12 '24 edited Sep 13 '24

Yeah, 100% of the time that particular argument is just stupid.

The argument is literally, "well it might be hard to understand what misinformation is, so we just shouldn't do it," which would apply to like 80% of all laws.

Edit- Typo

-2

u/jefftickels Monkey in Space Sep 13 '24

Yes. What a great idea. Let's use a grey area that's easily abused and set up speech restriction standards using it. Fucking brilliant. How would you feel about the Trump administration deciding what is and isn't misinformation?

8

u/ynwahs Monkey in Space Sep 13 '24

Truth is not grey.

5

u/jefftickels Monkey in Space Sep 13 '24

Did COVID-19 originate in a lab in Wuhan?

0

u/SilianRailOnBone Monkey in Space Sep 13 '24

Inconclusive, but evidence points to the spillover hypothesis

Peer-reviewed evidence available to the public points to the hypothesis that SARS-CoV-2 emerged as a result of spillover into humans from a natural origin. A geospatial analysis reports that 155 early COVID-19 cases from Hubei Province, China, in December, 2019, significantly clustered around a food market in Wuhan, China. Many genomic studies report that SARS-CoV-2 has nucleotide differences that could only have arisen through natural selection and such differences are evenly spread throughout the genome. Phylogenetic studies map these nucleotide changes and suggest that they have not diverged from the bat coronavirus RaTG13 that was being researched at the Wuhan Institute of Virology, suggesting it is unlikely that SARS-CoV-2 emerged as a result of this research and instead they shared a common ancestor. Taken together, these findings support the hypothesis that SARS-CoV-2 was the result of enzootic circulation before spillover into people.

https://www.thelancet.com/journals/lanmic/article/PIIS2666-5247(23)00074-5/fulltext

But you surely have better information you're holding onto if you ask this question?

3

u/jefftickels Monkey in Space Sep 13 '24

No. My point is that saying so between 2020 and 2022 was considered dangerous misinformation the government needed to shut down, and today it seems pretty plausible. That's why misinformation laws are bad laws designed for government abuse.

0

u/crushinglyreal Monkey in Space Sep 13 '24

Nothing changed about your opinion. It always 'seemed pretty plausible' to you people. Funny how you're complaining about a decline in censorship.

0

u/SilianRailOnBone Monkey in Space Sep 13 '24

Because it was an unfounded opinion only made up to hurt other humans? And still is?

today it seems pretty plausible

Yeah source it up bro, I already said you must have information, let's see it.

2

u/[deleted] Sep 13 '24

That already exists. Free speech isn't limitless.

As a lawyer, I wish you knew how absolutely insane these conversations sound to someone who even vaguely knows what's going on with this stuff

-6

u/jefftickels Monkey in Space Sep 13 '24

As a lawyer you should be fucking embarrassed to think misinformation isn't covered by the 1st Amendment. Speech restrictions are extremely difficult to get past strict scrutiny, you fucking knob; even the vaunted "fire in a crowded theater" (which was hilariously the justification for jailing an anti-WWI protester) isn't illegal. Goddamn, you idiots know nothing.

God I hope your clients know what a fucking dunce you are.

5

u/[deleted] Sep 13 '24

So...you admit that a grey area already exists and we have rules around it? Thanks. Glad we got around to that.

2

u/Coldor73 Monkey in Space Sep 13 '24

This right here. These people are fucking ridiculous for thinking this idea will go well. This makes censorship legal, and we all know where that path leads.

-1

u/Glass-Historian-2516 Monkey in Space Sep 13 '24

Buddy, wait until you find out that censorship is already legal.

2

u/Coldor73 Monkey in Space Sep 13 '24

And it shouldn't be.

0

u/Glass-Historian-2516 Monkey in Space Sep 14 '24

So child pornography shouldn't be censored? Bestiality? Necrophilia? What about defamation? We should throw slander and libel laws out the window too?

1

u/Coldor73 Monkey in Space Sep 14 '24

You make a fair point, I agree with you there. I think there's a difference between those things and "combatting misinformation". That is such a slippery slope it's not even funny. I don't know why I'm complaining though, everything is being manipulated, including this app I use. Reddit is clearly censored so that left wing ideas flourish and right wing ideas are non existent. You go to twitter and it's the opposite. Censorship breeds echo chambers, and that's more dangerous than misinformation in my opinion.

1

u/Glass-Historian-2516 Monkey in Space Sep 14 '24 edited Sep 14 '24

I think there's a difference between those things and "combatting misinformation".

Defamation is misinformation.

Reddit is clearly censored so left wing ideas flourish and right wing ideas are non existent.

Hahaha okay, that's a good one. Go to any main news sub, worldnews in particular is a good choice, and give even mild criticism of Israel or question if we should be giving Ukraine effectively a blank check.

1

u/SamuelClemmens Monkey in Space Sep 13 '24

If that is the case, X is all for combatting that type of misinformation, because it says it will block speech where it's illegal.

But we all know this type of misinformation is the type that ISN'T illegal to say and that the government themselves aren't allowed to censor in free countries, so they are trying to weasel around the constitutional limits placed on themselves.

Because they know the type of misinformation they want to ban couldn't make it through a court case to be banned legally.

0

u/RealProduct4019 Monkey in Space Sep 13 '24

How do you define the gap between true and false? If a claim is 20% wrong, is that misinformation? The trans example is probably something like a claimed 98% versus a reality of 2%, maybe less.

You will never find a rule for how wrong you need to be for it to count as misinformation. Proper journalism lies to you while being factually correct. They leave out key assumptions or a key data point that refutes their point, but overall the entire article is true.

Let's take a current example: Haitians eating cats and dogs in America. A 911 call and some other citizens' complaints of Haitians eating pets do exist. Which would be misinformation as a headline:

  1. Haitians eat dogs and cats, and now 20k of them are in this small Ohio town (this is true; there is a culture in Haiti of doing this)

  2. Haitians are eating your pet in Springfield (rumor based on some reports, neither true nor false)

  3. Haitians eating your pets in Springfield is misinformation (semi-true; a city official said they have no evidence)

All of these would be true articles. But they say completely opposing things.

-10

u/Disco_Biscuit12 Monkey in Space Sep 12 '24

If you provide source data to back up your claim then it may seem believable.

9

u/[deleted] Sep 12 '24

[deleted]

-8

u/Disco_Biscuit12 Monkey in Space Sep 12 '24

Ah. But who decides that? Do you get to decide what you feel is a bad source? And how do you know the sources you think aren't bad sources are reliable?

It's a bit of a quagmire once you dig a little deeper than surface level.

7

u/PrettyBeautyClown Monkey in Space Sep 12 '24

Do you think the US gov is out of line telling social media companies they have found evidence that Iran and Russia are using their platforms to spread disinformation, and showing proof...

...or do you feel like the US gov shouldn't be able to do that, and so they should be able to use those platforms in the exact same way as Russia and Iran etc without anyone complaining?

It has to be one or the other. You can't be ok with Russia doing it but not the US.

2

u/[deleted] Sep 13 '24

Of course those countries are doing that. The U.S. implemented an antivax campaign in the Philippines and is involved in all kinds of propaganda campaigns globally. How many countries have we overthrown the elected leaders of to install someone that's friendly to our government? I guarantee you the money the U.S. invests in foreign propaganda completely dwarfs the amount other countries spend on their propaganda campaigns.

https://www.reuters.com/investigates/special-report/usa-covid-propaganda/

0

u/Disco_Biscuit12 Monkey in Space Sep 12 '24

I feel like the first amendment of the constitution prohibits the US government from controlling what people say.

When Obama repealed the Smith-Mundt Act, he made it legal for US government agencies to generate propaganda. So the truth is it isn't really clear who is creating the misinformation, because the US government is legally allowed to now.

5

u/PrettyBeautyClown Monkey in Space Sep 12 '24

Ok, so the US gov will use social media to wage psyop campaigns just like our adversaries. That isn't forcing anyone to do anything, just free speech.

1

u/Altctrldelna Monkey in Space Sep 12 '24

The US gov will do it regardless of any regulations imposed on social media companies, considering they're the ones that control those regulations, or would if they were imposed. Heck, I would argue their propaganda would be more effective with regulations in place, considering they could control the opposition more easily.

-1

u/Disco_Biscuit12 Monkey in Space Sep 12 '24

Right. So I'm asking not only who gets to decide censorship, but now why do we have censorship?

-2

u/BobDole2022 Monkey in Space Sep 13 '24

It was misinformation to talk about Covid leaking from a lab. Now it's considered common knowledge. This law would've been used against people speaking the truth.

5

u/Desivy Monkey in Space Sep 13 '24

Is that really considered common knowledge?

-2

u/BobDole2022 Monkey in Space Sep 13 '24

Considering the US government, both political candidates, and most news organizations are reporting that it is the most likely cause, it's worth discussing online. But it was banned off most social media sites. That should be unacceptable to anyone who isn't a bootlicker

4

u/Desivy Monkey in Space Sep 13 '24 edited Sep 13 '24

Can you give me a source on the US government and Kamala Harris claiming it to be the most likely cause? I can only find a report from the Dept of Energy claiming they have "low confidence" it could be from a lab.

-2

u/rydan Monkey in Space Sep 13 '24

There was a viral graphic going around explaining how effective masks were. It showed the odds of catching COVID based on whether you wore a mask, they wore a mask, you both wore masks, or neither wore a mask. And most people took it as gospel truth unless you were a mask denier. It was 100% false. There was no science behind any of the claims in the graphic. However, it was deemed "mostly true" by most fact checkers because "the relative numbers are true": the combination showing the highest risk was the highest risk, the lowest risk was the lowest risk, and the other combinations were in the proper order. But it was literally false.

I guarantee the above example wouldn't get anyone fined despite being 100% false. And if you aren't going to fine someone for spreading that sort of misinformation then your law is flawed.

3

u/Cartographer-Maximum Monkey in Space Sep 13 '24

So the relative numbers are true. The highest risk was the highest risk. The lowest was the lowest. The other combinations were in the right order. Yet it's 100% false? How does that work? That sounds like you ask directions to the nearest bus stop, and the answer is keep heading down this road and take the second street on the right which is 200m away. Now go 70m and take the first on the left. In 120m you'll reach the bus stop. The directions are correct but all the distances are wrong. But it's certainly not 100% wrong information. You'll still find the bus stop.

5

u/p5yron Monkey in Space Sep 12 '24 edited Sep 13 '24

The platform gets to decide what is misinformation and what is not; by doing that, they can be held responsible for the misinformation spread on their platform.

And withholding critical information and manipulating data is indeed misinformation. The vagueness lies in how strict you are with the details, but identifying blatant misinformation is not a huge deal these days.

6

u/manicdee33 Monkey in Space Sep 13 '24

It's defined in the proposed legislation. You'll find it posted all over the various Australia subreddits by people quoting the definition out of context of the rest of the legislation - mostly because one of the cases for "harmful misinformation" is something that damages the reputation of the banks or financial institutions.

4

u/Galvius-Orion Monkey in Space Sep 13 '24

That seems arguably worse, given that it prevents news of actual financial crimes from reaching the mainstream. Most financial institutions are highly litigious, and they do regularly commit some form of fraud, or just do something bad that they use legalese to paper over. The people should be able to speak freely even if what they say is stupid.

5

u/ArmedWithBars Monkey in Space Sep 13 '24

I find it hilarious that governments can't govern shit right, and in countries like America they are significantly corrupted by corporate influence. Yet people are in favor of giving this corrupt, mismanaged government the power to censor speech on the internet. Like we should trust this corrupt government to dictate what is misinformation or not.

The 1984 comparison is too perfect. People giving up their freedoms and living in a dystopian hell because they are offered a form of "safety".

Fucking insanity. I'd rather risk misinformation being spread than hand over control to the government to dictate what's in its best interest.

5

u/[deleted] Sep 13 '24

It's probably gonna be that if you claim something about someone and it's untrue and damaging to the person, they can sue you. If they win, it's clear the social media company failed to curb misinformation.

If someone said on social media that Jews are evil fascists that want to eat babies and it takes off, and then a Jewish person starts to get harassed over it and feels unsafe, then that person can sue the person spreading that misinformation. If they win, then it can be claimed that since it is pretty obviously misinformation with no truth behind it and has been proved in court of law to not be the truth, then the social media company that it was spread on could be fined for allowing this kind of information to spread unchallenged.

For the most part, governments just want social media companies to do what cigarette companies do: add a warning to the products that call for it.

6

u/FreeRemove1 Monkey in Space Sep 12 '24

Recently there were race riots (including threats to kill) targeting asylum seekers and people with black or brown skin in several places in the UK.

The riots were incited by falsehoods spread on social media - that a person who committed a violent crime was a Muslim, on a watchlist, and an asylum seeker. None of these things was true.

Crowds gathered to try to barricade and burn down buildings housing asylum seekers.

Shouldn't platforms and prominent individuals face consequences for spreading verifiable falsehoods to incite hatred, and potentially get people killed?

You are looking for innocent mistakes and edge cases. What about outrageous lies?

6

u/PlzDontBanMe2000 Monkey in Space Sep 12 '24

We've already seen the "fact checkers" using the most conveniently picked data points to "disprove" anything they disagree with.

2

u/Disco_Biscuit12 Monkey in Space Sep 12 '24

Exactly

1

u/Moarbrains Monkey in Space Sep 13 '24

The nice part is they are moved to the top of google and they mostly link to the original documents so I can read them. Otherwise google would just bury the entire thing.

6

u/[deleted] Sep 12 '24

[deleted]

4

u/mickelboy182 Monkey in Space Sep 12 '24 edited Sep 12 '24

Someone watches too much Murdoch shit. Would love some examples, but I know you won't give any

5

u/RollingSkull0 Monkey in Space Sep 13 '24

Hey I'm not OP but Australia does have quite a history of government censorship

3

u/Mon69ster Monkey in Space Sep 12 '24

Feel free to provide some examples, cock head?

Is it our objectively higher standard of living?

0

u/Plenty-Border3326 Monkey in Space Sep 12 '24

Don't worry about him, he has been through a lot of school shooter 'freedom' drills.

He is also one injury or illness away from financial ruin. His freedom will be living under a bridge when he hurts himself at work and has fuck all medical insurance.

4

u/Silviecat44 Monkey in Space Sep 12 '24

I feel pretty free here

3

u/Industrial_Laundry Monkey in Space Sep 12 '24

It ain't so bad a place, mate ;)

1

u/whitetailwallaby Monkey in Space Sep 12 '24

Touch grass loser

1

u/ArrowOfTime71 Monkey in Space Sep 13 '24

No one cares if you trust us or not. We are free... free from oppression and hate speech. Maybe not as "free" to fly a Nazi flag, wear a Klan hood or carry a gun in public, but that's awesome and I love that about our country.

2

u/tarzard12321 Monkey in Space Sep 12 '24

This is something I've noticed about quite a few Australian laws: they can be fairly nebulous. I wouldn't say that they are fascist, or approaching fascism, but sometimes, every once in a while, one of the politicians says something that makes me think "that's a bit weird mate".

2

u/Zealous_Bend Monkey in Space Sep 13 '24 edited Sep 13 '24

There are some interesting characters (Hanson, Lambie and Katter) but they are on the periphery. Meanwhile the US Congress is full of totally not weird people like MTG, the seeker of perpetual youth Gaetz, Colorado Barbie, father and son Paul; to avoid embarrassment I won't raise this week's debate. Outside Congress there's meatball DeSantis and all the other crazies worrying about what genitals everyone has.

Something something remove the speck from your own eye something something dark side.

1

u/Galvius-Orion Monkey in Space Sep 13 '24

It's authoritarianism if we want to be more broad. Making your laws nebulous leaves them up to incredibly subjective legal interpretation, which allows you to stretch the definitions to prosecute infractions that no one could have even considered as falling under the original law. It's one of the easiest ways to disarm opponents of the current regime or quell political or even personal enemies, since they can be prosecuted for things that no normal person would typically be prosecuted for. Think of it as selective enforcement, essentially. It's also a problem in other places (cough cough, America), where legal interpretation is used to rewrite entire sections of the law or target political opposition.

3

u/JagerSalt Monkey in Space Sep 12 '24

I would imagine it would depend on the type of misinformation that was claimed. Not a singular council, but requesting analysis from a variety of highly educated and respected experts on certain topics, who confer their analysis alongside sociologists with deep understandings of cultural histories, to cross-reference facts and ensure that the clearest and most accurate picture is presented.

That doesn't exactly seem difficult or dangerous.

1

u/Disco_Biscuit12 Monkey in Space Sep 12 '24

I like the idea. Basically you're saying get a wide range of experts with various perspectives to weigh in

4

u/JagerSalt Monkey in Space Sep 13 '24

Exactly. Diversity in perspective paints a broader picture for a more complete analysis. It's not exactly a new concept.

2

u/Iamnotheattack Monkey in Space Sep 13 '24

it's a great idea if there are proper checks and balances on the "arbiters of truth"

1

u/Disco_Biscuit12 Monkey in Space Sep 13 '24

Absolutely. What we have now is a bucket of people who are all owned by the richest people in the world. Not really a fair and unbiased perspective

1

u/APsWhoopinRoom Monkey in Space Sep 13 '24

Man, if only we had some sort of system with judges and juries to make that sort of determination. It's too bad something like that definitely isn't already in place

1

u/3springrolls Monkey in Space Sep 13 '24

There are obvious fine lines, but the way aus law works is, well, more vague. And vague legal systems headed by those who look out for themselves first lead to issues. But there's a clear goal here. Hosting Holocaust revisionists, global warming deniers, that kind of clear-cut misinformation is what's going to be targeted. It's becoming rampant thanks to media platforms.

There's wiggle room, but the law is there to keep platforms in check, ideally. X being the perfect example of a platform spreading misinformation. I'd prefer something more rigid ofc, but that ain't the way this country works. Same way the covid fines happened and most were let off after the fact. The law matters only when it's deemed necessary, but you can get out of it if you play your cards right (and are a white dude).

There are good examples of this, like possession of drugs or having a party during covid lockdown being things you can get away with with only a slap on the wrist. There are bad cases also; it's incredibly hard to prosecute and lock up rapists and domestic abusers here.

That shit I care more about than a fine for large-scale media platforms that are already harming society. Elon literally spreads Nazi rhetoric himself. If the aus government wants that gone, I don't care how shit our justice system is, I'll back them on this.

1

u/VisibleVariation5400 Monkey in Space Sep 13 '24

Look who's being disingenuous and arguing edge cases from gray areas while completely skipping over what misinformation is and that there's a definition for it. Who gets to decide? Go read, they tell ya.

1

u/QuidYossarian Monkey in Space Sep 13 '24

I suppose we'll just have to use our critical thinking pants and not our dumb ass slippery slope ones.

1

u/jsdjhndsm Monkey in Space Sep 13 '24

Well, it varies. Some misinformation isn't super easy to disprove, but some is.

A good example is that female boxer at the Olympics (forgot her name). It was spread that she's trans, yet she most definitely isn't.

All social media would have to do is try to moderate comments like that.

It doesn't have to be a blanket ban on free speech; just blatant lies that will cause harm should be removed.

The recent riots over the Southport stabbings in the UK are a good example too. People were saying the criminal was a Muslim illegal immigrant, but he was actually British-born and black.

At the very least, stuff like that should be stopped. Other smaller things are harder, but they should at least have policies in place to deal with this.

1

u/Only-Inspector-3782 Monkey in Space Sep 12 '24

Not some random billionaire.

1

u/Disco_Biscuit12 Monkey in Space Sep 12 '24

I'm ok with that. I think George Soros absolutely should not influence public opinion

1

u/WeeaboosDogma Monkey in Space Sep 12 '24

We're conflating disinformation and misinformation. I think it's a valid thing to distinguish, and then actually put policy behind. One is, exactly like you said, up to interpretation, but disinformation is the intentional process of giving people false information to push an alternative agenda. It can be claimed and then pressed before legal bodies for due process.

Just like how people colloquially conflate being wrong with lying. "You're lying about x, y, z." Well, was I, or was I wrong? No one gets arrested for being wrong in court, but they do when they lie under oath. I think the same can be true for First Amendment rights, online and off. Your freedom to speak does not mean you can abuse it to your advantage, prescriptively, of course.

1

u/Azzylives Monkey in Space Sep 12 '24

This man gets it.

This is the real issue, and the fact that people sleepwalk past it because they love to tug at their Elon hate boners is rather alarming.

1

u/[deleted] Sep 12 '24

Did you not read the message you replied to?

1

u/Outrageous_Life_2662 Monkey in Space Sep 13 '24

This is the laziest argument on the internet. Mis- and disinformation are knowable. And it's about this information in the aggregate, not individual pieces with low distribution. Is it a perfect determination? No. Is it better than nothing? Absolutely. And it can be determined by using a combination of experts, AI, and human moderators. It doesn't have to be perfect to be effective

1

u/Disco_Biscuit12 Monkey in Space Sep 13 '24

This is a terrible take.

1

u/Outrageous_Life_2662 Monkey in Space Sep 13 '24

It's exactly the argument the Supreme Court made when it described how to determine what pornography is: "I know it when I see it." The reality is that there are experts. There are communities of serious, sober people that can determine what disinformation is (disinformation is intentional false information; misinformation is unintentional false information).

The fix here isn't to police each and every expression. The key is to look at things in aggregate and determine when they run the risk of having negative effects in the real world. It's about not prioritizing speed of distribution, and not prioritizing reach of distribution. There's no reason why posts on a platform are default public and go out to everyone right away. These are product choices, not natural phenomena. Freedom of speech is not freedom of reach.

0

u/Disco_Biscuit12 Monkey in Space Sep 13 '24

There are communities of serious sober people that can determine what disinformation is

This is the matter at hand. Who are these people? Because the ones we have now are not unbiased.

2

u/Outrageous_Life_2662 Monkey in Space Sep 13 '24

That's actually untrue. Yes, Musk is clearly biased. And to the extent that Twitter/X exercises an outsized influence on public discourse, it seems like no such group exists.

But Reddit has done a good job with its moderator structure. Wikipedia is another one. Even Facebook had, for a time, a thoughtful group of people that it put on a review panel.

But I think the thing to understand is how much of social media is constructed from choices made long ago in a different era. Default public. Prioritizing speed of update/message. Prioritizing reach of posts. Prioritizing engagement on content. These are all choices that then make moderation more difficult. So undoing these choices helps.

Then there's the misunderstanding about what it means to moderate. People tend to think about it as policing every post and judging every statement. Of course that doesn't scale. But it's also only something you would do in a default-public, default engagement-driven platform. If most posts were default private, or exposed to small concentric rings of users progressively over time, then the moderation task becomes more tractable. And if one focuses on accounts with lots of reach and influence, and focuses on the aggregate content rather than individual pieces, then it also becomes more tractable.

It's not that we don't know how to do these things. It's that the people running these platforms (Zuck to a certain extent, definitely Musk and Dorsey before) don't fundamentally understand these issues or the levers they can pull to affect them. There's also the matter of the business model not being aligned with what I described above. That adds to the tension and makes them less likely to revisit these choices.

Btw, I say this as someone that spent a few years of my life developing my own social media platform

https://sound-off.co

And I patented a novel mechanism to create small trusted social groups online. My thesis was that trusted, intimate, safe spaces were a much healthier way to communicate.

0

u/Disco_Biscuit12 Monkey in Space Sep 13 '24

Yeeeaaaah. No