Introduction
Google’s Chrome User Experience (CrUX) data is a valuable and unique public resource if you’re aiming to improve your website’s SEO performance or benchmark your UX against competitors. Real-User Monitoring (RUM) tools, on the other hand, provide a detailed view of your website’s user base, allowing you to analyse individual customer journeys and thereby reveal concrete UX optimization potential or hints at technical problems. As both tools play an important part in maintaining a successful website, they are frequently used side by side. When comparing the UX metrics captured by both, however, you will likely find that CrUX and RUM measurements contradict one another more often than not. Does this mean either one is at fault? Surprisingly, the answer is no. In this post, we’ll give a detailed explanation of why CrUX and RUM often report contradicting performance measurements and why you should still keep an eye on both.
Of course, we’ll start with a brief introduction to CrUX and RUM. However, if you are looking for more details on those tools or on-site speed measurement best practices in general, check out Part 4 of our ongoing study on “Mobile Site Speed & The Impact on Web Performance” [1].
RUM — A holistic UX tracking setup
Real-User Monitoring (RUM) tools like mPulse, Datadog, or Dynatrace collect detailed information about the user’s experience, measured directly in the browser. The focus lies on several timers that express, for example, how long it took until the user saw the first content (First Contentful Paint) or until the page reacted to the first user interaction (First Input Delay).
The high cost of integrating such tools into your website comes with the benefit of capturing detailed information about every single page impression (e.g., browser, page type, timestamp, country). This enables extensive in-depth analysis of your website’s performance, including the exploration of user journeys and the identification of tangible optimization opportunities or technical issues.
CrUX — A zero-effort UX & SEO indicator
The Chrome User Experience (CrUX) report is based on data from Google’s own RUM tool that is integrated into Google’s Chrome browser: To leverage CrUX, you thus do not have to set up, maintain, or pay anything. Google is measuring your website’s performance automatically and provides the results via several outlets like the CrUX API or Dashboard (see Figure 1) [2].
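As a minimal sketch of how such a query looks in practice, the snippet below builds a request body for the CrUX API’s `queryRecord` method and extracts the 75th-percentile value from a response. The endpoint and field names reflect the public v1 API, but treat the exact payload shape as an assumption to verify against Google’s documentation; the sample response dict is a fabricated illustration, not real data.

```python
# Sketch of querying the CrUX API (v1 "queryRecord" method).
# Endpoint and field names are believed correct for v1, but verify
# against the official docs before relying on them.
CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def build_query(origin: str, form_factor: str = "PHONE") -> dict:
    """Request body for the CrUX API: one origin, one form factor."""
    return {
        "origin": origin,
        "formFactor": form_factor,
        "metrics": ["first_contentful_paint", "largest_contentful_paint"],
    }

def extract_p75(response: dict, metric: str) -> float:
    """Pull the 75th-percentile value for one metric out of a CrUX response."""
    return float(response["record"]["metrics"][metric]["percentiles"]["p75"])

# Hypothetical response fragment, shaped like the documented CrUX payload:
sample = {
    "record": {
        "metrics": {
            "first_contentful_paint": {"percentiles": {"p75": 1800}}
        }
    }
}
print(extract_p75(sample, "first_contentful_paint"))  # 1800.0
```

A real call would POST `build_query(...)` as JSON to `CRUX_ENDPOINT` with an API key from the Google Cloud console.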
Compared to your own RUM data, however, CrUX data only contains Chrome-based user experiences, provides fewer metrics, and is way less granular. For example, the data is aggregated over at least 28 days and only very limited user-related information is provided, excluding valuable insights like the visited page type (e.g., checkout vs. product detail page), the operating system, browser version, and many more.
Nevertheless, it plays an important role in optimizing your website’s SEO performance: Google uses the CrUX data itself to rank Google Search results [3]. This makes the CrUX data set one of the most relevant indicators for the success of your Google Search Engine Optimization (SEO) efforts. Furthermore, for websites that don’t have their own RUM in place, CrUX provides fast and actionable UX data out-of-the-box and allows benchmarking of your UX performance with competitors.
Why CrUX performance cannot be replicated with RUM
In the previous sections, we have outlined how RUM and CrUX differ and how they can complement each other when aiming to run a successful website. However, when using both CrUX and RUM, you will very likely run into the issue that both data sources report conflicting performance metrics — even if you only compare Chrome results and use the same aggregation (e.g. 75th percentile of the First Contentful Paint of the past 28 days).
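To make the comparison as fair as possible, you would mirror CrUX’s aggregation on the RUM side: restrict the beacons to a trailing 28-day window and take the 75th percentile. The sketch below does exactly that with fabricated example beacons (the timestamps and FCP values are illustrative, and the nearest-rank percentile is one common convention, not necessarily the one your RUM vendor uses):

```python
import math
from datetime import datetime, timedelta, timezone

def p75(samples):
    """Nearest-rank 75th percentile over a list of numeric samples."""
    ordered = sorted(samples)
    rank = math.ceil(0.75 * len(ordered))  # 1-based nearest-rank index
    return ordered[rank - 1]

# Hypothetical RUM beacons: (timestamp, First Contentful Paint in ms)
now = datetime.now(timezone.utc)
beacons = [
    (now - timedelta(days=d), fcp)
    for d, fcp in [(1, 1200), (3, 900), (10, 2500), (27, 1600), (40, 300)]
]

# Keep only the trailing 28-day window, mirroring CrUX's aggregation period.
window = [fcp for ts, fcp in beacons if now - ts <= timedelta(days=28)]
print(p75(window))  # 1600 (the day-40 beacon falls outside the window)
```

Even with this matched aggregation, the resulting numbers will differ from CrUX, for the reasons the next sections explain.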
But fear not! That doesn’t mean that either of them is wrong. They are simply not comparable because of Google’s custom tracking logic, which cannot be fully reconstructed with your RUM tracking.
Google’s CrUX tracking applies unreproducible filters
Let’s have a look at the Google CrUX tracking funnel in Figure 2 to understand what user group the CrUX performance numbers actually refer to.
Filter 1: Only Chrome experiences (excluding WebViews)
We already established that CrUX data only contains traces from Chrome users. As confirmed by Google, this also means that Chrome-based WebView traffic (i.e., in-app traffic on Android devices) is not included [4]. RUM data, on the other hand, usually covers all browsers, including WebViews.
As your RUM probably tracks the information on what browser has been used, this difference in tracking methodology could actually be resolved by filtering your RUM data accordingly. But as we have mentioned already, this still does not lead to the same results due to the following filters that are applied in Google’s user tracking funnel.
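As a rough illustration of such a filter (the function and user-agent strings are hypothetical examples), one common heuristic is to inspect the user-agent string: Android WebViews typically advertise a `; wv)` token that regular Chrome on Android does not. Note that UA sniffing is inherently fragile and your RUM tool may already classify browsers more reliably:

```python
def is_android_webview(user_agent: str) -> bool:
    """Heuristic WebView check: Android WebViews typically carry a '; wv)'
    token in their UA string. A rough filter, not an exact classification."""
    return "Android" in user_agent and "; wv)" in user_agent

# Illustrative UA strings (fabricated for this example):
webview_ua = ("Mozilla/5.0 (Linux; Android 12; Pixel 6 Build/SD1A; wv) "
              "AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 "
              "Chrome/96.0.4664.104 Mobile Safari/537.36")
chrome_ua = ("Mozilla/5.0 (Linux; Android 12; Pixel 6) AppleWebKit/537.36 "
             "(KHTML, like Gecko) Chrome/96.0.4664.104 Mobile Safari/537.36")

print(is_android_webview(webview_ua))  # True
print(is_android_webview(chrome_ua))   # False
```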
Filter 2: Only logged-in users
In contrast to RUM data, CrUX data is only collected for Chrome users who are logged into their browser with their Google account. This is a mandatory requirement, because CrUX’s data tracking is only possible given the user’s consent, as declared in the Google account settings [2]. Due to data privacy, this information is exclusively accessible to Google: In other words, the log-in state of a user is not accessible to custom RUM solutions.
In consequence, we are now already at a point where comparability between your holistic RUM data and CrUX data is conceptually impossible.
Filter 3: Only users with active browser history sync and no passphrase
Another condition for Google’s CrUX tracking is a user opt-in for syncing the browser history, configured via the personal Google account settings [2][5]. Only if the user has set up this sync and refrained from setting a passphrase for it is Google allowed to read their data [6].
Filter 4: Only users with enabled usage statistic reporting
Lastly, a logged-in Chrome user with active browser history sync and no passphrase is only included in Google’s CrUX data if they did not disable usage statistics reporting, which is enabled by default [2][7].
Side effect: Filters cause CrUX data to be biased
Not only are most of Google's filters not reproducible; they also cause the CrUX results to be unrepresentative with regard to your website's audience, as the reported subset is highly biased (only Chrome users with specific browser and privacy settings).
And there is even more…
We’ve discovered that on several occasions the Google CrUX data had not been up-to-date for several days without indicating the delay in the different outlets (e.g., in the CrUX API) [8]. Even though we don’t expect this to be a constant issue, it might still lead to temporarily outdated results and should be mentioned here for the sake of completeness and transparency.
Optimization sweet spots: CrUX for SEO, RUM for overall UX
So far we have established that Google uses numerous filters in their user tracking which cannot be replicated by your RUM tool due to missing information. That does not mean that your RUM is inferior, though. In fact, when considering all the mentioned limitations to which Google’s CrUX data is subject, several advantages of your RUM data emerge:
- It is built upon a much bigger foundation, as it does not exclude as many users as CrUX does.
- It is not skewed in the way CrUX data is, which focuses on a more homogeneous user group (only Chrome users with specific personal settings).
- It is provided on a way more granular level, thus allowing more precision in evaluating and optimizing your performance (e.g., by page type or date).
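The granularity advantage above can be sketched in a few lines: because RUM beacons carry per-impression context such as the page type, you can compute a separate 75th percentile for each segment, which CrUX cannot offer. The beacon data and page-type labels below are fabricated for illustration:

```python
import math
from collections import defaultdict

def p75(samples):
    """Nearest-rank 75th percentile over a list of numeric samples."""
    ordered = sorted(samples)
    return ordered[math.ceil(0.75 * len(ordered)) - 1]

# Hypothetical RUM beacons: (page_type, First Contentful Paint in ms)
beacons = [
    ("product_detail", 1400), ("product_detail", 900), ("product_detail", 2100),
    ("checkout", 800), ("checkout", 1200),
]

# Group beacons by page type, then aggregate each segment separately.
by_type = defaultdict(list)
for page_type, fcp in beacons:
    by_type[page_type].append(fcp)

for page_type, samples in sorted(by_type.items()):
    print(page_type, p75(samples))  # checkout 1200, product_detail 2100
```

The same grouping works for any dimension your RUM records (country, browser version, date), which is exactly the drill-down that CrUX’s 28-day, origin-level aggregates rule out.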
Ultimately, being considerate about when to use the CrUX data set over your RUM data is very important: While CrUX is a valuable indicator for your SEO performance, using your RUM data is most beneficial for all further UX optimizations.
Conclusion: You cannot validate your RUM results with CrUX data
Google’s CrUX report only shows the user experience results for a subset of your users, namely those who:
- are using Chrome (except WebViews),
- are logged in with their Google Account,
- actively allowed Google to sync their browsing history,
- don’t prohibit Google from reading the synced data,
- and didn’t disable the usage statistic reporting.
Your RUM data is based on a much bigger audience, as its tracking is not restricted in the way Google's CrUX tracking is. At the same time, your RUM tracking does not have access to your users’ personal Google account settings (e.g., whether the user enabled their browsing history sync). Thus, you are not able to filter the same subset of users in order to get the exact same results as Google.
That ultimately yields two conclusions. First, contradicting performance results alone do not qualify as an indicator for either CrUX or RUM being incorrect: They simply report performance measurements that are inherently incomparable. Second and more importantly, though, both CrUX and RUM are invaluable for optimizing SEO and UX performance — as long as the mentioned differences and limitations are taken into account when interpreting the results.
References
[1] Wolfram Wingerath, Mobile Site Speed & The Impact on Web Performance, Baqend Blog, 2020
[2] Chrome User Experience Report, Tools for Google Developers, 2022
[3] Addy Osmani, Ilya Grigorik. Speed is now a landing page factor for Google Search and Ads, Google Developers Blog (Web Updates), 2018
[4] Does CRuX report send data from WebView on Android devices?, Google Groups Discussion (Chrome UX Report), 2021
[5] Turn sync on and off in Chrome, Google Chrome Help Center, 2022
[6] Get your bookmarks, passwords and more on all of your devices, Google Chrome Help Center, 2022
[7] Usage statistics and crash reports, Google Chrome Privacy Whitepaper, 2021
[8] Stalled data since October 1st, Google Groups Discussion (Chrome UX Report), 2021