[Posted by Steven Noble]
Content moderation — deciding what user-generated content to publish and what to delete — is a serious business. Get it right, and your social application may well flourish. Make mistakes, and you’ll alienate the very people you were hoping to engage. An important step along the way is crafting a clear and trustworthy content moderation policy. That’s the topic of my first Forrester Research report.
However, there’s much more to content moderation than simply creating and implementing your policy. If your social application is large or complex, expect to make significant investments in moderation software, in labour or services, and in managing the whole shebang. These back-end issues — from picking the right vendors to measuring outcomes — will be the subject of future Forrester research.
So, if you’re deeply involved in content moderation — whether as a marketer or manager, a software vendor, or a provider of moderation services — now is the time to suggest a briefing. Likewise, please feel free to share your experiences in the comments below.