DataSlush

Beyond Accuracy: Uncovering the Value of Recommendation Coverage Metrics with Google Analytics

Welcome to the wonderful world of OTT platforms, where you can binge-watch your favorite shows and movies to your heart’s content. But have you ever wondered how these platforms decide which shows to recommend to you? That’s where recommendation engines come in. These engines use machine learning algorithms to analyze your viewing history, search queries, and other data to suggest the most relevant content for you.

Just like a blind date, not all recommendations are a perfect match. Sometimes you end up with a dud, and sometimes you hit it off right away. That’s why it’s important to measure recommendation coverage metrics in OTT web analytics. By tracking these metrics, you can make sure your recommendation engine is making a good match between users and content. So, let’s swipe right on this topic and explore recommendation coverage metrics in more detail. 

In the era of intelligent machines, we often spend a lot of time training models and measuring their accuracy on various benchmarks. However, once these models are deployed “in the wild” and begin interacting with real-world users, a new set of metrics becomes necessary. This is particularly true for recommendation engines in OTT platforms.

While we might have a well-trained machine-learning model that performs well on internal benchmarks, it’s important to remember that the goal of the recommendation engine is to help users find content that they enjoy. To measure how well the recommendation engine is achieving this goal, we need to look beyond traditional machine learning metrics and consider recommendation coverage metrics.

Lights, Camera, & Metrics

First, let’s look at the various stages of the user journey on an OTT platform: from the homepage to content categories, content details, and related and recommended content.

Note: This is an idealized user journey; the actual flow depends on the structure of the OTT platform.

Based on this user journey, we can break recommendation coverage reporting down into the following key metrics:

  • Click-through rates (CTR): This metric measures the percentage of viewers who clicked on a recommended piece of content. A higher CTR indicates that the recommendation algorithm is doing a good job of matching viewers with the content they’re interested in.
  • View-through rates (VTR): This metric measures the percentage of viewers who watched a recommended piece of content after clicking on it. A higher VTR indicates that the recommendation algorithm is not only matching viewers with the content they’re interested in but also presenting it in a way that is compelling enough to get them to watch.
  • Drop-off rates: This metric measures the percentage of viewers who stopped watching a recommended piece of content before finishing it. A high drop-off rate could indicate that the recommendation algorithm is not doing a good job of matching viewers with content that’s relevant to their interests, or that the recommended content is not living up to viewers’ expectations.
    Note: Drop-off rates can be biased if they don’t take into account cases where users return to the content later and continue watching it. To reduce this bias, it’s important to track not just drop-off rates but also completion rates, which measure how many users actually finish watching the content. Additionally, it may be helpful to track the amount of time users spend away from the content before returning to it, as this can provide insights into how engaging the content is and how likely users are to return to it after a pause. 
  • Recommendation coverage: This metric measures the percentage of available content that is being recommended to viewers. A low recommendation coverage could indicate that the recommendation algorithm is not considering all available content, which could limit viewers’ exposure to new or niche content.
  • Diversity of recommendations: This metric measures the variety of content being recommended to viewers. A higher diversity of recommendations could indicate that the recommendation algorithm is doing a good job of presenting viewers with a range of content that matches their interests and preferences.
  • Personalization effectiveness: This metric measures the percentage of recommended content that is actually watched by individual viewers. Higher personalization effectiveness indicates that the recommendation algorithm is doing a good job of understanding viewers’ preferences and presenting them with content they’re likely to enjoy.
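As a quick reference, the first few metrics above boil down to simple ratios. A minimal sketch in JavaScript; the function names and inputs are illustrative, not a Google Analytics API:

```javascript
// Illustrative formulas for the metrics defined above.
// Inputs are plain counts; each function guards against division by zero.

// Click-through rate: clicks on recommendations / recommendation impressions
function clickThroughRate(clicks, impressions) {
  return impressions === 0 ? 0 : clicks / impressions;
}

// View-through rate: views after a click / clicks
function viewThroughRate(views, clicks) {
  return clicks === 0 ? 0 : views / clicks;
}

// Drop-off rate: views abandoned before the end / total views
function dropOffRate(dropOffs, totalViews) {
  return totalViews === 0 ? 0 : dropOffs / totalViews;
}

// Recommendation coverage: distinct items ever recommended / catalog size
function recommendationCoverage(recommendedItems, catalogSize) {
  return catalogSize === 0 ? 0 : recommendedItems / catalogSize;
}
```

For example, 50 clicks on 1,000 recommendation impressions gives a CTR of 0.05, and a catalog of 1,000 titles with 200 ever recommended gives a coverage of 0.2.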

By tracking the above metrics, OTT platforms can better understand how well their recommendation algorithms and personalization features are working, and identify areas for improvement. These metrics can also help product managers and developers optimize the recommendation algorithms to drive higher engagement and viewership.

How can OTT platforms track these metrics using Google Analytics?

The custom events and parameters should be pushed to the dataLayer from the pages or components where the corresponding user actions or events occur.
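Under the hood, the dataLayer is just an array of event objects; the Google Tag Manager snippet normally creates `window.dataLayer` on the page. A minimal sketch of the push pattern, modeled with a local array so it is visible outside a browser, where `pushEvent` is an illustrative helper rather than a GTM API:

```javascript
// The dataLayer is an array of event objects. GTM normally creates
// window.dataLayer; here it is a local array for illustration.
const dataLayer = [];

// Illustrative helper (not a GTM API): stamps the event name onto the
// parameters before pushing, so call sites stay short and consistent.
function pushEvent(eventName, params) {
  dataLayer.push({ event: eventName, ...params });
}

// Example: a click on a recommended item
pushEvent('recommendation_clicked', {
  recommended_content_id: 'abc123',
  content_type: 'movie'
});
```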

For example:

Click-through rates (CTR) and view-through rates (VTR) should be pushed when a user clicks on a recommended content item or views it for a certain amount of time.

// Click-through rates (CTR)
dataLayer.push({
  'event': 'recommendation_clicked',
  'recommended_content_id': 'abc123',
  'content_type': 'movie',
  'recommendation_algorithm': 'collaborative_filtering',
  'click_position': 3
});

// View-through rates (VTR)
dataLayer.push({
  'event': 'recommendation_viewed',
  'recommended_content_id': 'def456',
  'content_type': 'tv_show',
  'recommendation_algorithm': 'content-based_filtering',
  'view_position': 1,
  'watch_time': 120
});

Drop-off rates should be pushed when a user stops watching a content item before the end.

// Drop-off rates
dataLayer.push({
  'event': 'content_dropoff',
  'content_id': 'ghi789',
  'content_type': 'documentary',
  'watch_time': 300,
  'dropoff_time': 180
});
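Downstream, events like this one can be rolled up into the drop-off and completion rates discussed earlier. A minimal sketch, assuming each event also carries a hypothetical content_length parameter so a drop-off can be distinguished from a completed view:

```javascript
// Roll content_dropoff-style events up into drop-off and completion rates.
// content_length is an assumed, illustrative field: a view counts as a
// drop-off when the user stopped before the end of the content.
function dropOffStats(events) {
  if (events.length === 0) {
    return { dropOffRate: 0, completionRate: 0 };
  }
  const dropped = events.filter(e => e.dropoff_time < e.content_length).length;
  return {
    dropOffRate: dropped / events.length,
    completionRate: (events.length - dropped) / events.length
  };
}
```

Tracking completion alongside drop-off is exactly the de-biasing step mentioned in the note above: a user who returns later and finishes moves from the drop-off bucket to the completion bucket.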

Recommendation coverage and diversity of recommendations should be pushed when recommendations are generated for a user.

// Recommendation coverage
dataLayer.push({
  'event': 'recommendation_coverage',
  'recommendation_algorithm': 'hybrid_filtering',
  'total_available_content': 1000,
  'recommended_content': 200
});

// Diversity of recommendations
dataLayer.push({
  'event': 'recommendation_diversity',
  'recommendation_algorithm': 'popularity-based_filtering',
  'total_recommended_content': 50,
  'unique_content_count': 30
});
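The parameters on these two events reduce to simple ratios. A sketch using the same field names as the pushes above:

```javascript
// Coverage: share of the available catalog that was actually recommended.
function coverageRatio(event) {
  return event.total_available_content === 0
    ? 0
    : event.recommended_content / event.total_available_content;
}

// Diversity: share of distinct items among everything recommended.
function diversityRatio(event) {
  return event.total_recommended_content === 0
    ? 0
    : event.unique_content_count / event.total_recommended_content;
}
```

With the example values above, coverage is 200 / 1000 = 0.2 and diversity is 30 / 50 = 0.6.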

Personalization effectiveness should be pushed when a recommended content item is viewed by a user.

// Personalization effectiveness
dataLayer.push({
  'event': 'personalization_effectiveness',
  'user_id': 'user123',
  'recommended_content_id': 'jkl012',
  'content_type': 'tv_show',
  'view_time': 600
});
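Personalization effectiveness is then the share of recommended items a given user actually watched. A minimal sketch with illustrative inputs (lists of content IDs collected from the events above):

```javascript
// Share of a user's recommended items that they actually watched.
// recommendedIds and watchedIds are illustrative arrays of content IDs.
function personalizationEffectiveness(recommendedIds, watchedIds) {
  if (recommendedIds.length === 0) return 0;
  const watched = new Set(watchedIds);
  const hits = recommendedIds.filter(id => watched.has(id)).length;
  return hits / recommendedIds.length;
}
```

For example, a user who was recommended four titles and watched two of them has an effectiveness of 0.5 for that session.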

Implementing this type of tracking can be challenging, and we at DataSlush can assist here. We are focused on helping organizations successfully complete their data initiatives and get value out of them.

Author

  • Darsh Shukla

    As a Data Solution Architect, I am dedicated to harnessing the power of data to minimize business friction. With a degree in Computer Engineering and proven experience in data-driven roles, I specialize in designing and constructing efficient data pipelines, applications, and solutions for production environments.
