This could get *very* expensive

The insistence is on editorial powers as the reality of SM.

SM have taken that up and then backed away from it somewhat. But SM are past that threshold now and are expected to exercise editorial power.

The law has not caught up. The agendas have not fully formed, other than at Twitter. Editorial powers mean an agenda.

I heard a great comparison on the radio this morning.

If a radio host says something illegal on air, not only can they be liable, so can the station. If it is printed in a newspaper, both the paper and the author can be liable. If it is said on TV, both the speaker and the TV station can be liable.

But if the same radio host, who is getting compensation from YouTube for subscribers, says it online, YouTube is not liable. If Facebook recommends it to a visitor, Facebook has no liability.

I think that is problematic. Perhaps if there were at least SOME liability, we would see a lot less nonsense online - and I don’t think the removal of that nonsense keeps anyone from finding DIY repair videos or chatting with their FB friends.

6 Likes

In fairness, the pass was given to help the internet take flight early on.

I think all of those major dot coms are cruising at a pretty high altitude these days.

You would see a LOT LESS of everything online. In fact, you would only see stuff that is very popular. That’s because it is literally impossible to police anything else on a regular basis. If they have liability for what gets posted, they need to look (human, not machine, because machine apparently comes with the same liability) at everything before it goes out. That means the top X00,000 pieces of content get to go out while nothing else does.

2 Likes

That is true but ignores the scale of the problem. A newspaper or radio station deals with a couple dozen “voices” in a day. A radio talk show may have 8-10 callers in an hour, but they are on tape-delay so can be dumped if they are truly pernicious.

By contrast, Facebook and YouTube are dealing with tens of thousands of “voices” every minute. There is simply no way for a platform to police all of that. (That doesn’t make it right, I’m just pointing out the practical realities.)

Using the Motley Fool as a very, very small comparison: they have what, some dozens of people posting, and yet it would be hard to pre-censor everything that gets onto a board the way newspapers and radio stations can. That’s why the telephone company was held harmless for speech that happened over its wires; it had no way to interfere with it even if it wanted to.

It’s a conundrum, but technological progress doesn’t always come in a tidy package.

6 Likes

Between automation and AI, none of this is a problem.

I am reasonable.

Plato, in The Republic, made a strong argument against most public arts and forums of communication, and I (and most of us) strongly disagree with him.

But his critique, when applied to “Social Media”, was prophetically accurate: social media are

• essentially useless, with a pretense of importance and validity,
• forums of lies and deceits, and
• possessed of a strong inherent tendency towards eliciting and amplifying immoral abuse.

david fb

4 Likes

The other side of this SM coin.

One of the most vocal groups in the US on SM is African Americans. A far greater call for justice in America has come forward, and I can fully applaud this. It would not have been possible without social media. The counter-voices of violence against African Americans do not bother to smile as they kill, no differently than any other day.

We have far more freedom because of the internet.

There is not just one SM to attain editorial powers. Instead there is a web of ideas and voices that gives us an opportunity to find not just common ground but, in our humanity, a much greater honesty.

We need the guts to face something about how we use people. Whether it is pay, healthcare, education, or retirement funding through public rules or the public purse, we need honesty, as opposed to holding our nation back.

Adding just a thought: if you take the argument against computer screens a little further, it leads to the violence of video games. What is interesting is that in the real world our pristine wishes are crushed by our guns. The screens are not the issue in our national madness.

While the issue is indeed more complex, nearly every other major economy has found a way to address it in some fashion - so perhaps we are making a bigger deal of it than it is.

Of particular note:

Other countries use a notice-and-takedown approach similar to the Digital Millennium Copyright Act (DMCA). Once again, this gives online services some immunity, but it can lead to the removal of content that may be controversial but not harmful or illegal.

Congress could rewrite 230 to require such companies to remove offending material in a timely manner - or face liability.

Other countries don’t have constitutional rights like we have.

Who decides what is offensive?

There are no constitutional rights at issue here. You have no right to protection from censorship by a private company, and if not for Section 230, the internet would be treated the exact same way as all other media outlets - in a Constitutional manner.

Section 230 is statutory law and can be repealed or amended (or perhaps found unconstitutional) at any time. I will say again, you have no Constitutional right to post anything anywhere online.

If content appears to violate current law - such as libel or slander - then when a complaint about it is submitted, sites would be required to review and pull the offending content, much the same way they already do here at TMF for copyright violations.

This will never be an exact science and it is entirely possible, even probable, that sometimes sites will err on the side of caution and pull something down that they should not have - but again, you have no Constitutional right to post on Facebook or Twitter (or TMF) in the first place. Heck, the chief twit arbitrarily bans people from Twitter all the time when they say something he does not like, and those people are not making a Constitutional claim that their rights were violated. If you feel strongly about your opinion, you are always welcome to start your own website/blog and speak your mind. Neither the Washington Post nor Facebook has any obligation to host your content.

6 Likes

Some other-country govts promote outright lies and deceit. They will never take down their web sites/pages–so long as they promote what that govt chooses. Thus, we face two problems: Hosting (which is where the material is posted online) and promotion–which is what all the various search engines do. Promotion is also done by individuals on their own web sites, blogs, and so on.

There’s an analogous standard in place for custodians/sellers of physical written material here in the U.S. as well, like bookstores or libraries. A bookstore isn’t initially considered the “publisher” of defamatory material. They’re just selling books, and they don’t (and aren’t expected to) have knowledge of everything in every book that might be defamatory. So they are free from liability at first. However, if the existence of defamatory material is established (i.e., a court ruling against the work) and brought to their attention, they have to remove the books or else face liability.

But you can see the devil lurking in the details - establishing whether a work is defamatory, rather than (say) a copyright infringement. Sure, there’s lots of arguments and nuance in edge cases about what is “fair use” or not - but for the most part, one can (typically) determine whether a piece of work posted on a website violates copyright or not by examining the work itself. Does it quote/excerpt/display copyrighted material? How is that material used? Etc.

Defamation and other types of liability are very different. Suppose I write the sentence, “No one should visit Albaby’s Pizza Factorium because Albaby is a cheater and his food is unhealthy.” There’s no way to tell without investigation whether that statement is or is not libelous, because the defenses to libel for that kind of statement can’t be ascertained just by looking at the statement. So when Albaby sends his Digital Antemillennium Libel Act (DALA) takedown request, how does Facebook know whether to take it down? The sentence might be true. Albaby might be a public figure, or a limited public figure. Libel and slander laws are (typically) established by state and not federal law, so whether the statement is defamatory may depend on where the speaker or Albaby are physically located. Etc.

Albaby

3 Likes

Again, I am not claiming that there is a perfect solution, but there are in fact many examples of other countries and other industries (as you illustrate with the bookstore) that don’t have Section 230 protections and have found a way to cope and still remain in business.

Jeff Kosseff, author of “The Twenty-Six Words That Created the Internet” and probably more knowledgeable about this subject than any other person (his book is about Section 230), is in favor of amending 230. I share his opinion.

Kosseff: So, if I went on Facebook and defamed you, you couldn’t sue Facebook, but you would be able to sue me. Section 230 doesn’t prevent you from suing me for defamation… We’ve had some courts say even in that situation, Section 230 would prevent a platform from being required to remove that content that’s adjudicated to be defamatory. I don’t think that should be the case… I think that Section 230 shouldn’t protect a platform from having to take it down.*

Full disclosure: in the above interview, he also goes into great detail about the risks of completely eliminating Section 230.

Personally, I would take it one step further and apply some sort of DALA/DMCA standard. Again, people have stuff pulled from TMF all the time because someone complained for various reasons. No reason why that practice can’t be enshrined into law. The practice certainly doesn’t keep respectful dissenting opinions from being posted here. I think there should be a different standard than what we apply to other media but there are certainly improvements that can be and should be made.

  • [full quote edited to reduce the risk of copyright infringement! :upside_down_face:]
2 Likes

Of course. But they “cope” mostly by being almost impassable gatekeepers - by giving an infinitesimally smaller fraction of speakers access to their platforms than even a mid-size internet entity (let alone Google or Facebook). Very few people can get their content offered for sale in a Barnes and Noble or broadcast on an ABC-owned channel, compared to the access offered to millions of people by YouTube or Twitter.

It’s probably an easy - but relatively trivial - change to Section 230 to allow a court to order that content that has been adjudicated to be defamatory be removed from a website. After all, we do that with bookstores and libraries already. But that’s generally not going to solve any of the problems that people associate with the internet.

As pointed out in the article you quoted, Section 230 makes things better, not worse, on social media. Without Section 230, the only way for companies to avoid liability is to retreat into the “mere conduit” protections of existing defamation law - to not try to moderate anything that isn’t illegal. Every website would then have to choose between eliminating all user-generated content (unless pre-cleared), or allowing all user-generated content.

Neither gets us to a “better” version of Facebook or Yelp or Instagram or what have you. An internet where nobody gets to post anything unless they and/or their content has been pre-vetted will not look anything like the current internet (no more Fool discussion boards, or any free discussion boards or comment sections). An internet where everybody gets to post anything because there is no more content moderation won’t look like the current internet either, because there are awful people.

You can’t have both. You can’t have an internet where platforms are liable for defamatory statements made by users (not just have to take them down when they are ordered to) and an internet with non-trivial amounts of user-generated content.

3 Likes

The EU would appear to disagree.

The Digital Services Act (DSA) entered into force on November 16, 2022. This new European regulation builds on the Electronic Commerce Directive to strengthen the moderation obligations of online platforms regarding illegal content, such as racism, child pornography, counterfeiting and disinformation. Among various obligations, online platforms must remove illegal content as soon as they become aware of it and comply with new transparency obligations to avoid heavy fines. For more insights, please refer to our journal article, The Proposed Digital Services Act Package – What You Need To Know.


Not really. Their rules are simply that platforms have to remove illegal content as soon as they become aware of it. That’s already pretty much the case in the U.S. Content that is illegal under federal law (like child pornography or copyright infringing material, to take two wildly disparate examples) is already subject to takedown requirements even for internet companies. Heck, even a local bookstore would have to stop selling actually illegal content.

That’s not the same as saying platforms are liable for non-criminal but defamatory content.

Because the EU doesn’t have the equivalent of our First Amendment, there’s just less “awful but lawful” stuff. They’re making a lot of the awful stuff illegal. You can get takedown orders against a wide array of content. You can’t do that in the U.S. Some content that’s being made “illegal” under this new act (such as racism and disinformation) literally can’t be made illegal in the U.S. So that moderation function can’t be outsourced to a court or regulatory agency that can issue takedown orders. The stuff only gets removed by the platform if the platform applies its content moderation policies… and thus the platform can’t retreat to the “mere conduit” safe harbor that might be available to it in the EU.

3 Likes

What do you mean by “illegal”?
Bill Maher once said that Trump’s mother was an orange orangutan. Donald Trump tried to sue him. No court would hear the case.

How do you define a statement as “illegal”?

1 Like

Against the law?!? How else does one define illegal?

Criminal is illegal.

I am not a lawyer, but civil cases are not about what is illegal.

1 Like