User-generated content and moderation boundaries

What this page is

A neutral framing of how responsibility is typically discussed when a bot handles user-generated content (UGC), such as messages, files, or links.

What this page is not

  • A universal moderation policy
  • A claim that content can be reliably filtered
  • Legal advice

Definitions and scope

  • UGC: content created or provided by users.
  • Moderation: actions that limit visibility, access, or distribution of content.

Decision points

  • Whether the bot stores or republishes user-provided content
  • Whether the bot provides amplification (broadcasting, forwarding, indexing)
  • Whether the bot applies any filters or refusals
  • Whether and how moderation actions can be appealed or reviewed
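One lightweight way to make these decision points auditable is to record them explicitly rather than leave them implicit in code paths. The sketch below is a hypothetical illustration (the class and field names are assumptions, not a standard schema), assuming the four decision points above map to one flag or field each:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class UgcDecisionRecord:
    """Hypothetical record of the design choices a bot operator makes for UGC."""
    stores_content: bool           # does the bot persist user-provided content?
    amplifies: bool                # broadcasting, forwarding, or indexing?
    applies_filters: bool          # any filtering or refusal logic?
    appeal_process: Optional[str]  # how moderation actions are reviewed, if at all

    def summary(self) -> str:
        """Compact label of which reach-affecting behaviors are enabled."""
        flags = [
            name for name, enabled in [
                ("stores", self.stores_content),
                ("amplifies", self.amplifies),
                ("filters", self.applies_filters),
            ] if enabled
        ]
        return ", ".join(flags) or "relay-only"

record = UgcDecisionRecord(stores_content=True, amplifies=False,
                           applies_filters=True, appeal_process=None)
print(record.summary())  # stores, filters
```

A record like this can be published alongside the bot's documentation, so the answers to the decision points are stated once rather than inferred from behavior.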

Responsibility boundaries

  • When a bot republishes or amplifies UGC, the operator typically controls design choices that affect reach and persistence.
  • When a bot only relays content within a chat context, control may be more limited, but design choices (like defaults and logging) still matter.
  • Users typically control what they submit, but not necessarily how the bot stores, redistributes, or annotates it.

Typical evidence to document

  • Whether the bot stores UGC, and retention periods
  • Whether the bot republishes content outside the original chat
  • Moderation triggers at a high level (avoid procedural detail)
  • Audit trail practices for moderation actions
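For the audit-trail point above, a minimal practice is an append-only log line per moderation action that records *what* happened and a coarse *why*, while deliberately omitting procedural detail such as exact trigger rules. The function below is an illustrative sketch, not a prescribed format; the field names and reason codes are assumptions:

```python
import json
from datetime import datetime, timezone

def audit_entry(action: str, content_id: str, reason_code: str) -> str:
    """Build one append-only audit line for a moderation action.

    The entry references the content by an opaque ID rather than copying it,
    and uses a coarse reason code rather than the matching rule itself.
    """
    return json.dumps({
        "ts": datetime.now(timezone.utc).isoformat(),
        "action": action,           # e.g. "restrict" or "remove"
        "content_id": content_id,   # opaque reference, not the content
        "reason_code": reason_code, # high-level category only
    }, sort_keys=True)

# Example: logging a hypothetical restriction on a message.
print(audit_entry("restrict", "msg-4821", "ugc-policy"))
```

Keeping the entry free of the content itself also means the audit trail does not extend the bot's retention of UGC beyond whatever the stated retention period is.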

Open questions

  • Does the bot copy content to external storage?
  • Is UGC searchable or indexed outside the original conversation?
  • Are there administrator controls for removing or restricting content?