>It gets worse.
>Section 230 contains a “Good Samaritan” provision that protects intermediaries when they take measures to filter or block certain types of content. This section ensures that intermediaries are not punished for mistakes they might make in removing or failing to remove user-generated content. In other words, a service’s decision to block some content does not make it liable for what it didn’t block.
>SESTA would compromise the Good Samaritan provision by imposing federal criminal liability on anyone who merely knows that sex trafficking advertisements are on their platform. Platforms would thus be discouraged from reviewing the content posted by their users, cancelling out the very incentive to review, filter, and remove content that the Good Samaritan provision was meant to create.
>That puts companies that run online content platforms in a difficult bind. Any attempt to enforce community conduct guidelines could be used as evidence that the company knew of trafficking taking place on its service. (As we mentioned above, federal criminal law already applies to intermediaries.) Platforms would seem to face two choices: adopt extremely restrictive measures that compromise their users’ free speech and privacy, or do nothing at all.
oh no, you mean that censoring Nazis might mean that twitter is liable for CP?
oh no
what a bad thing
how terrible