You’re hearing with increasing frequency that social listening platforms (SLPs) are incorporating artificial intelligence and that AI may eventually unburden humans from heavy social data management and analysis. The reality: This is exaggerated. You still need a hefty amount of human involvement to set the rules, train the models, and maintain the platform. In researching this tech category for our “Now Tech: Social Listening Platforms, Q2 2018” and “The Forrester Wave™: Social Listening Platforms, Q3 2018” reports, we uncovered how AI manifests in SLPs:

  • Fully enabled AI technology must sense, think, and act. SLPs sense and think but aren’t fully acting yet. To start, SLPs sense by rapidly mining and processing troves of social data. Then, they think by learning, clustering, and surfacing relevant themes, topics, and audience segments. Finally, they act, but that action is largely limited to detecting social conversation anomalies and triggering alerts (see the first sketch after this list). SLPs are not fully autonomous just yet; humans must still make decisions and act on the data and insights that SLPs surface.
  • Rules, machine learning, and AI are different methods of achieving automation, and SLPs use a combination of all three. With rules, humans define the logic up front and an algorithm executes it from there. Machine learning enables computers to infer the best rules for a specific objective from historical data. AI strives to mimic human intelligence through experience and learning via the sense, think, and act modes described above (the second sketch below contrasts rules with machine learning).
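To make the “act” step concrete, here is a minimal sketch of the kind of anomaly detection that triggers those alerts. Everything in it is assumed for illustration: the mention volumes, the `detect_anomalies` function, and the z-score threshold are hypothetical, not any vendor’s implementation.

```python
import statistics

def detect_anomalies(hourly_mentions, window=24, threshold=3.0):
    """Flag hours whose mention volume deviates sharply from the trailing window."""
    alerts = []
    for i in range(window, len(hourly_mentions)):
        history = hourly_mentions[i - window:i]
        mean = statistics.mean(history)
        stdev = statistics.stdev(history) or 1.0  # guard against a perfectly flat history
        z_score = (hourly_mentions[i] - mean) / stdev
        if abs(z_score) >= threshold:
            alerts.append((i, hourly_mentions[i], round(z_score, 1)))
    return alerts

# Hypothetical data: steady chatter, then a spike in the final hour.
volumes = [50, 48, 52, 47, 51, 49, 53, 50, 46, 52, 48, 51,
           49, 50, 47, 53, 52, 48, 50, 51, 49, 47, 52, 50, 240]

for hour, count, z in detect_anomalies(volumes):
    print(f"ALERT: hour {hour} saw {count} mentions (z-score {z})")
```

Even at this level of automation, the platform only raises a flag; a human still decides whether the spike is a crisis, a campaign taking off, or noise.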
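And to illustrate the second bullet, the sketch below contrasts a hand-written rule with a model that learns its own decision logic from labeled history. The keyword list, training posts, and labels are invented for this example, and real platforms train on far larger corpora.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Rules: a human writes the logic up front; the code just executes it.
NEGATIVE_WORDS = {"broken", "refund", "terrible"}

def rule_based_flag(post: str) -> bool:
    return any(word in post.lower() for word in NEGATIVE_WORDS)

# Machine learning: the computer infers its own "rules" from labeled historical posts.
train_posts = ["love this product", "total waste of money", "works great",
               "arrived broken again", "best purchase ever", "demanding a refund"]
train_labels = [0, 1, 0, 1, 0, 1]  # 1 = negative sentiment (hypothetical labels)

vectorizer = CountVectorizer()
model = LogisticRegression().fit(vectorizer.fit_transform(train_posts), train_labels)

post = "what a waste"
print("rule says negative: ", rule_based_flag(post))  # False: no listed keyword matches
print("model says negative:", bool(model.predict(vectorizer.transform([post]))[0]))
```

The rule only fires on words a human anticipated, while the model can flag “waste” because it saw that word in labeled negative history. That learning step is exactly where humans remain essential: someone has to supply, label, and maintain the training data.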

Check out our new report, “Demystify Artificial Intelligence In Social Listening Platforms,” to learn more about what automation — not just AI — really means in this technology.

Many thanks to Arleen Chien, Brigitte Majewski, Srividya Sridharan, and Caitlin Wall for their great work on this report.