It'll be a lot trickier running any kind of U.S.-based social network under these circumstances, regardless of whether it's sex- or relationship-oriented. After all, any general-purpose group communication, advertising or messaging system may be used for the kind of communication targeted by this legislation - communication its operators were previously not liable for.
Any good moderator also knows that matters of user conduct are seldom black and white - indeed, the cases where you try to make the right call and it goes wrong are likely to be the most troublesome. If you have suspicions but allow activity to continue, that could be viewed as facilitation; it's almost better not to moderate at all. Users also seldom welcome intrusion into their private messages, let alone having to reveal personal real-life information to staff to prove a bona fide relationship.
I suspect this is, in part, aimed at those services which have sought to offer end-to-end encryption. If you provide a service whose communications you cannot decrypt and monitor for prostitution or sex trafficking, it doesn't seem too much of a leap to view that as a form of facilitation - and if it's been brought to your attention that this might be happening, does that not make it knowing facilitation?
Of course, for once, those sites which aren't based in the USA might have an easier time of it. If anything, this will drive more services overseas - perhaps to Europe, since a U.S.-based service handling European users' data is already at risk of violating EU law by taking personal data out of the EU. Besides, it's cheaper.