Subscription Retention

About this chart

The Subscription Retention chart tracks how many paid subscriptions remain active over time after their initial purchase. It helps you measure subscriber loyalty, identify when users tend to churn, and evaluate the long-term impact of your acquisition and retention strategies.

⚠️ Change from dashboard v1: counting subscriptions, not subscribers

The previous version of this dashboard counted unique subscribers (users). Dashboard v2 now counts unique subscriptions, each identified by a unique subscription ID.

This changes the numbers in two ways:

  • Restored subscriptions across devices: In v1, when a subscription was restored on a new device by a different anonymous user, it was counted multiple times — once for each anonymous user associated with it. In v2, the subscription is counted only once regardless of how many devices or anonymous users it passes through.
  • Multiple subscriptions per user: In v1, a user holding two active subscriptions simultaneously was counted once (one user). In v2, each subscription is counted individually, so the same user contributes two to the total.

Example: Alice holds both a monthly Music plan and a yearly Premium plan. In v1, Alice counted as 1 subscriber. In v2, she counts as 2 active subscriptions. Conversely, if a single subscription was restored across 3 anonymous devices in v1, it appeared as 3 subscribers — in v2 it correctly counts as 1 subscription.
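The difference comes down to which identifier is deduplicated. A minimal sketch of the two counting rules, using hypothetical field names (`user_id`, `subscription_id`) rather than the dashboard's actual schema:

```python
# Hypothetical event records; field names are illustrative only.
events = [
    {"user_id": "alice",  "subscription_id": "sub-music"},
    {"user_id": "alice",  "subscription_id": "sub-premium"},
    # one subscription restored across three anonymous device users:
    {"user_id": "anon-1", "subscription_id": "sub-restored"},
    {"user_id": "anon-2", "subscription_id": "sub-restored"},
    {"user_id": "anon-3", "subscription_id": "sub-restored"},
]

# v1: count unique subscribers (users)
v1_count = len({e["user_id"] for e in events})          # alice + 3 anons = 4
# v2: count unique subscriptions (subscription IDs)
v2_count = len({e["subscription_id"] for e in events})  # 3 distinct subscriptions

print(v1_count, v2_count)  # 4 3
```

With this data, v1 both over-counts the restored subscription (three anonymous users) and under-counts Alice (her two plans collapse into one user); v2 resolves both.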

How to read the chart

The chart is a cohort retention table. Each row represents a group (cohort) of subscriptions that started during the same time period.

The table has the following columns:

  • Segment: The time period when the subscriptions in this cohort started (e.g., "2026, day 083" in daily view, or "2026, week 12" in weekly view).
  • Paid subscriptions (started and reactivated): The initial size of the cohort, i.e., how many paid subscriptions started or were reactivated during that period.
  • Revenue: Total revenue generated by this cohort during the selected period.
  • ARPPU: Average Revenue Per Paid Subscription for this cohort (Revenue / Paid subscriptions).
  • Day 0, Day 1, Day 2...: How many subscriptions from this cohort are still active after N periods (or Week 0, Month 0, etc., depending on granularity).

The diagonal staircase pattern is normal: recent cohorts have fewer columns because not enough time has elapsed to measure later periods.
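The retention cells can be thought of as "how many of this cohort's subscriptions are still active N periods after the cohort start". A small sketch of that computation for one daily cohort, under an assumed record shape (`start`/`end` dates, `end=None` meaning still active) that is not the product's real schema:

```python
from datetime import date

# Illustrative subscription records; field names are assumptions.
subs = [
    {"id": "s1", "start": date(2026, 3, 1), "end": date(2026, 3, 3)},
    {"id": "s2", "start": date(2026, 3, 1), "end": None},            # still active
    {"id": "s3", "start": date(2026, 3, 2), "end": date(2026, 3, 2)},
]

def cohort_row(cohort_start, periods):
    """Count subscriptions from one daily cohort still active after N days."""
    cohort = [s for s in subs if s["start"] == cohort_start]
    row = []
    for n in range(periods):
        check = date.fromordinal(cohort_start.toordinal() + n)
        active = sum(1 for s in cohort if s["end"] is None or s["end"] >= check)
        row.append(active)
    return len(cohort), row

size, row = cohort_row(date(2026, 3, 1), 4)
print(size, row)  # 2 [2, 2, 2, 1]
```

Here Day 0 always equals the cohort size (nothing has churned yet), which is why the first retention column reads 100% in rates mode.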

Controls

Show

Switch between two display modes:

  • Subscriptions retained (rates): Percentage of the cohort still active at each period (e.g., 96.7% means 96.7% of the original cohort is still subscribed).
  • Subscriptions retained (numbers): Absolute count of subscriptions still active at each period.
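The two modes show the same underlying counts; rates are simply each count divided by the cohort's initial size. A one-line illustration with made-up numbers:

```python
cohort_size = 300
active_at_period = [300, 290, 271, 260]  # hypothetical "numbers" mode values
# "rates" mode: each count as a percentage of the original cohort
rates = [round(100 * n / cohort_size, 1) for n in active_at_period]
print(rates)  # [100.0, 96.7, 90.3, 86.7]
```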

Segment

Focus on a subset of subscriptions:

  • All subscriptions: All paid subscriptions, including trials that converted, intro offers, promo offers, and direct full-price purchases.
  • Full-price subscriptions: Only subscriptions that started at full price (excludes trials, intro offers, and promo offers).

Granularity

Use the Daily / Weekly / Monthly selector to control the time resolution of cohorts and retention periods.

  • Daily — One cohort per day, retention measured day by day. Best for short-term analysis (days to weeks).
  • Weekly — One cohort per week, retention measured week by week. Good for spotting weekly patterns.
  • Monthly — One cohort per month, retention measured month by month. Best for long-term trends.
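Granularity determines how a subscription's start date is mapped to a cohort label. A sketch of that mapping, assuming label formats modeled on the examples above (the dashboard's exact formatting may differ):

```python
from datetime import date

def cohort_key(d, granularity):
    """Map a start date to its cohort label; label formats are illustrative."""
    if granularity == "daily":
        return f"{d.year}, day {d.timetuple().tm_yday:03d}"
    if granularity == "weekly":
        iso = d.isocalendar()  # (ISO year, ISO week, weekday)
        return f"{iso[0]}, week {iso[1]}"
    if granularity == "monthly":
        return f"{d.year}-{d.month:02d}"
    raise ValueError(f"unknown granularity: {granularity!r}")

d = date(2026, 3, 24)
print(cohort_key(d, "daily"))    # 2026, day 083
print(cohort_key(d, "weekly"))   # 2026, week 13
print(cohort_key(d, "monthly"))  # 2026-03
```

Note the weekly label uses the ISO year, which can differ from the calendar year for dates near January 1.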

Filters

Click Filters to narrow the data. You can combine multiple filters.

  • Platforms: iOS, Android, or both.
  • Plan periodicity: Filter by billing cycle (weekly, monthly, yearly, etc.).
  • Offer types: Filter by how the subscription started (trial, intro offer, promo offer, standard).
  • Countries: Filter by user country.
  • Screens: Filter by the paywall screen that triggered the subscription.
  • Placements: Filter by where in the app the paywall was shown.
  • Audiences: Filter by the audience segment the user belonged to at purchase time.
  • Plans: Filter by specific subscription plan.
  • A/B tests: Filter by the A/B test the user was enrolled in.
  • Campaigns: Filter by campaign attribution.

Common use cases

  • Compare trial vs. full-price retention — Use the Segment dropdown to switch between "All subscriptions" and "Full-price subscriptions". If full-price retention is significantly higher, your trials may be attracting lower-intent users.
  • Measure the impact of a paywall change — Filter by Screen or A/B test and compare cohort retention before and after the change.
  • Identify platform differences — Filter by Platform to see if iOS and Android subscribers retain differently.
  • Spot billing cycle issues — Look for sharp drops at Day 7 (weekly plans), Day 30 (monthly plans), or Day 365 (yearly plans). These indicate renewal failures or intentional cancellations.

Frequently asked questions

Why do weekly cohort totals not match the monthly total?

When you add up the "Paid subscriptions" column across all weekly cohorts for a given month, the total may be higher than the single monthly cohort for the same period. This is not a data error.

The reason: when a subscription payment fails, it enters a billing retry period — the app store retries the charge over several days. If the subscription churns in week 10 and recovers in week 11, it appears as "started" in one cohort and "reactivated" in the other. The same subscription is counted twice across two weekly cohorts. In monthly view, both events fall within the same month, so it is counted only once.

Example: the six weekly cohorts covering March 2026 total 893 paid subscriptions (35 + 206 + 211 + 192 + 215 + 34), but the monthly March 2026 cohort shows only 835. The 58 extra subscriptions were each counted in two weekly cohorts because their billing retry periods crossed week boundaries.

This applies to any granularity comparison: daily totals won't match weekly totals, and weekly totals won't match monthly totals.
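The mechanism can be demonstrated in a few lines. Below, one subscription produces a "started" event in week 10 and a "reactivated" event in week 11 (dates and field names are illustrative); counting unique subscription IDs per bucket double-counts it weekly but not monthly:

```python
from datetime import date

# One subscription that churns and recovers across a week boundary.
events = [
    {"sub_id": "s1", "type": "started",     "date": date(2026, 3, 2)},
    {"sub_id": "s1", "type": "reactivated", "date": date(2026, 3, 11)},
]

def cohort_counts(key_fn):
    """Unique subscriptions per cohort bucket; a subscription whose start and
    reactivation land in different buckets appears in both."""
    buckets = {}
    for e in events:
        buckets.setdefault(key_fn(e["date"]), set()).add(e["sub_id"])
    return {k: len(v) for k, v in buckets.items()}

weekly = cohort_counts(lambda d: f"week {d.isocalendar()[1]}")
monthly = cohort_counts(lambda d: f"{d.year}-{d.month:02d}")

print(sum(weekly.values()), sum(monthly.values()))  # 2 1
```

Summing the weekly buckets gives 2, the monthly bucket gives 1: the same discrepancy, in miniature, as the March 2026 example above.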

Why does retention look higher when I switch to a wider granularity?

For the same reason. When a subscription churns and recovers within the same time bucket (e.g., within the same month), the net effect is zero — it never appears to have churned.

Example: a subscription enters billing retry on Jan 5 and recovers on Jan 20.

  • Weekly view: churn and recovery land in different weeks — retention dips then recovers.
  • Monthly view: both events land in January — the subscription never appears to have churned. Retention stays flat.

The wider the time bucket, the more of these temporary churn-and-recovery pairs become invisible. This is standard behavior across analytics tools.
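A small sketch of the Jan 5 / Jan 20 example, under a simplified model where retention is sampled at bucket checkpoints (the dashboard's exact sampling may differ):

```python
from datetime import date

# One subscription that enters billing retry on Jan 5 and recovers on Jan 20.
churned = date(2026, 1, 5)
recovered = date(2026, 1, 20)

def active_on(day):
    """True if the subscription is active on `day` (illustrative model)."""
    return not (churned <= day < recovered)

# Weekly checkpoints see the dip; the month-end checkpoint does not.
weekly_checks = [date(2026, 1, 7), date(2026, 1, 14),
                 date(2026, 1, 21), date(2026, 1, 28)]
print([active_on(d) for d in weekly_checks])  # [False, False, True, True]
print(active_on(date(2026, 1, 31)))           # True: never appears churned
```

At weekly resolution the subscription reads as churned for two checkpoints and then recovered; at monthly resolution the single January checkpoint shows it active throughout.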

Which granularity should I use?

  • Daily when you need precision on short-term retention and accurate cohort counts.
  • Weekly for week-over-week trend analysis with a good balance of precision and readability.
  • Monthly for long-term trends and executive reporting — but keep in mind that cohort sizes and retention rates will be slightly optimistic due to the billing retry smoothing effect described above.

As a rule of thumb: don't try to reconcile totals across granularities. The differences are an expected artifact of billing retry, not a data discrepancy.