The Need for Content Administrators on Digital Platforms

With the rapid growth of digital platforms, there is an increasing risk of illegal content being shared, such as non-consensual and underage material. Appointing a dedicated Content Administrator could help ensure compliance with laws like FOSTA-SESTA and deter these harmful activities.

Discussion Points:

  1. How can platforms balance user privacy with the need for content oversight?
  2. What role should a Content Administrator play in ensuring both compliance and user trust?
  3. Can legislation support content moderation without stifling free speech?

What are your thoughts on this? Would appointing a Content Administrator make platforms safer or create more challenges?

The current strategy of blurring images marked as “sensitive” by default is a good compromise between allowing legal content and protecting sensitive viewers.

Illegal content is already covered by various laws, and the website itself is protected by safe-harbor provisions (at least in the US), which means the website cannot be held liable for illegal material as long as it has a process for taking it down after an official legal takedown request. Twitter (X) enjoys the same protections: you can still find illegal content on Twitter, but it is taken down very quickly when reported.

I don’t think Dexie should get into the business of censorship. They should of course obey all applicable laws regarding illegal content. But applying censorship above and beyond what the state requires is what got us into the mess of the last decade.
