Analytics for the Publishing Industry: Content Intelligence, the best solution!

Perfect your content strategy with Artificial Intelligence

I’m not going to explain what content marketing is: you are surely already aware of the crucial role that content plays in accompanying consumers along their buyer journeys. Indeed, a growing number of organizations are adopting data-driven approaches, supported by AI, to guide the production of increasingly effective content.

But if companies use algorithms to build content strategies aligned with their users, then the publishing industry, which makes its living from producing content, should long since have learned how to optimize the value of its work from a data-driven perspective. In reality, the situation is very different, as Jon Wilks of Content Insights explained.

During his “Data Insights for Journalists” presentation, Wilks pointed out that, although editorial offices purchase various analytics tools, the editors who use them often don’t really know what they’re doing. Indeed, within the industry the difference between Analytics and Editorial Analytics remains largely unknown: “data & analytics” is on everyone’s lips, and yet a search for “Editorial Analytics” returns only 4,020 Google results.


A Bit of History 

First, let’s try to understand how we got to this point.
Thomas Davenport of the International Institute for Analytics identified four “eras” of analytics:

  • Early Era Analytics

The “official” launch of analytics occurred in the 1950s, but the first analytics departments were not taken seriously: they were segregated in “back rooms,” far from the halls of power where business decisions were actually made on the basis of experience and intuition rather than data analysis.

In the Nineties, with the advent of the Internet, people began to recognize the importance of analytics. In 1997, Urchin appeared, the first analytics package on the market; it became Google Analytics (GA) in 2005 and today holds around 83% of the analytics market. A real “must-have” for marketers, since it tracks and predicts the funnel, it nevertheless pushed the publishing industry toward a dangerous abyss: that of following only “single metrics.”

  •  Sexy Time

With the advent of social media and real-time analytical platforms, data became seductive and, for the first time, turned into “live” indicators of popularity, allowing companies to “act” in response.

Starting from 2010, with the huge flows of data being created, a real arms race broke out among companies. At this point, the publishing industry disconnected: it focused dangerously on single metrics alone (likes and shares, page impressions, page views, average pages per visit, scroll depth, etc.), which do not give a full overview of reader behavior. Trying to monetize such shallow parameters is madness.

Real-time metrics are great if you are a front-end editor who wants to maximize clicks in the moment, but the tactic is flawed in the long term. If Trump is among the trending topics, you can exploit him in the short term, but you cannot build an editorial strategy on him. Wilks cites the example of a fishing magazine that ran a piece on Trump fishing for tuna: a highly followed and widely read article, of course, but then what? You can’t just publish Trump.

How can we turn the publishing industry into a “data-powered” one, where the integrated use of data feeds the decision-making process? Single metrics, taken on their own, cannot represent the genuine engagement of the reader.
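To make the contrast concrete, here is a minimal sketch of how several single metrics could be combined into one composite engagement score. The metric names, the normalization to [0, 1], and the weights are all hypothetical illustrations, not Content Insights’ actual model:

```python
# Hypothetical composite engagement score: a weighted average of
# several normalized single metrics (each assumed to be in [0, 1]).

def engagement_score(metrics: dict, weights: dict) -> float:
    """Weighted average of the metrics named in `weights`."""
    total_weight = sum(weights.values())
    return sum(w * metrics.get(name, 0.0)
               for name, w in weights.items()) / total_weight

# Two articles: one is a viral one-off, the other retains its readers.
viral = {"page_views": 1.0, "scroll_depth": 0.2, "return_visits": 0.1}
loyal = {"page_views": 0.4, "scroll_depth": 0.9, "return_visits": 0.8}

# Illustrative weights that value depth and loyalty over raw clicks.
weights = {"page_views": 1.0, "scroll_depth": 2.0, "return_visits": 3.0}

# The viral article wins on page views alone, yet scores lower overall.
assert viral["page_views"] > loyal["page_views"]
assert engagement_score(viral, weights) < engagement_score(loyal, weights)
```

The point of the sketch is only that an article which dominates a single metric (page views) can still lose to one that engages readers across the board.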



The Future of Analytics for Publishing: Content Intelligence

The solution is Content Intelligence, that is, Artificial Intelligence applied to content. Here are its main advantages:

  • automatic classification of all content: classifying an archive manually is unmanageable and risks producing inconsistent results. With the rationalization provided by AI, which identifies the topics in each piece of content through machine learning, speech-to-text and image-recognition features, editorial offices end up with an ordered content archive in which it is easy to find what you are looking for and to optimize its value.
  • AI engines associate the tags describing content topics with the people, even anonymous ones, who have viewed that content, yielding valuable data on their interests. This has a dual advantage:
  1.  knowing which content performs best, and progressively refining your content strategy as a result;
  2.  personalizing the communication addressed to each reader. Because CI builds a complete overview of each reader’s behavior, it can feed automation systems whose aim is to show users the most relevant content for their interests, on the right device and at the right time.
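The tag-to-reader association described above can be sketched in a few lines. Everything here is an illustrative assumption, not a specific Content Intelligence product’s API: content tagged by topic, anonymous reader profiles (keyed by something like a cookie ID) that accumulate interest counts from views, and a naive overlap-based recommendation:

```python
# Sketch of tag-to-reader association for personalization.
# Article IDs, tags, and the recommendation rule are hypothetical.
from collections import Counter

articles = {
    "a1": {"fly-fishing", "gear"},
    "a2": {"tuna", "politics"},
    "a3": {"fly-fishing", "rivers"},
    "a4": {"celebrity"},
}

profiles = {}  # anonymous reader id -> Counter of topic interests

def record_view(reader_id, article_id):
    """Credit each of the viewed article's topic tags to the reader."""
    profiles.setdefault(reader_id, Counter()).update(articles[article_id])

def recommend(reader_id, seen):
    """Pick the unseen article whose tags best overlap the reader's interests."""
    interests = profiles.get(reader_id, Counter())
    return max(
        (a for a in articles if a not in seen),
        key=lambda a: sum(interests[t] for t in articles[a]),
    )

record_view("anon-42", "a1")  # fly-fishing, gear
record_view("anon-42", "a2")  # tuna, politics

# "a3" shares the reader's fly-fishing interest; "a4" shares nothing.
print(recommend("anon-42", seen={"a1", "a2"}))  # -> a3
```

A production system would of course use learned topic models and richer behavioral signals rather than raw view counts, but the mechanism, tags on content joined to interest profiles on readers, is the same one the list above describes.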