Web Vitals field data: use cases, benefits and limits of the different field data sources in Google Search Console, the Chrome UX Report, Pagespeed Insights and the JavaScript library.
A comparison of the current (07/2020) options for Web Vitals field data and what you can use each data source for.
Check the video about this blog post
Field data and lab data
In the past we all used lab test data for page speed. That's useful too, but mostly during development (more about lab data). Lab data has some problems:
- Network conditions / CPU consumption can differ
- Your site is dynamic: sometimes elements load, sometimes they don't, e.g. ads
- The browsers and devices of your users behave differently than your test device
So now, with Web Vitals, you want field data from real users. Learn more about what Web Vitals are here: https://web.dev/vitals/
Currently there are different tools for field and lab data. The JavaScript library is missing here, but it's a useful overview:
Let’s compare different field data options:
1) Web Vitals field data in Google Search Console
Field data in Google Search Console is amazing
- It’s YOUR data. Daily collected automatically without any additional setup or coding
- The dimension is URLs + an aggregated metric (LCP, CLS or FID)
- You can check mobile and desktop
Limits with Web Vitals in Google Search Console
- Checking on user, pageview or session level is not possible. You only get affected URLs.
- For now it's only daily, with a two-day delay.
- With big sites it’s not possible to get all the affected URLs as an export
- If your site is too small you don't get this data at all. Field data in GSC needs a critical mass of users visiting your site with Chromium browsers.
Use case
- a good starting point: field data over time, on URL level, and it's your data
- no setup effort
2) Web vitals field data in Chrome UX Report
The Chrome UX Report is a public dataset with data from 18 million websites, showing how they are doing in speed and UX. It offers data from late 2017 until now.
It's available in BigQuery and via the Chrome UX Report API.
Usage with BigQuery
- There is raw data per country, and some summary data in materialized tables
- With the raw data you can check data daily
- It’s not live but with some delay. 1–2 months in my tests
- You can filter countries, connection types, form_factors
There is a Data Studio option to access the data, too:
Usage with Chrome UX Report API
- daily updated data
- fewer filter options than BigQuery, e.g. you can't go back in time
You can easily access it from JavaScript, PHP, … or e.g. Google Apps Script to create something like this in Google Sheets:
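As a minimal sketch of such an API call: the endpoint and response shape below follow the CrUX API's records:queryRecord method, while the helper functions and `KEY` placeholder are my own choices for illustration.

```javascript
// Sketch: query the Chrome UX Report API for an origin.
// Endpoint and response shape per the CrUX API (v1, records:queryRecord).
// KEY is a placeholder for your own API key from the Google Cloud console.
const ENDPOINT = 'https://chromeuxreport.googleapis.com/v1/records:queryRecord';

// Build the POST body; formFactor is optional ('PHONE', 'DESKTOP' or 'TABLET').
function buildQuery(origin, formFactor) {
  const body = { origin };
  if (formFactor) body.formFactor = formFactor;
  return body;
}

// Pull the 75th percentile out of a response record,
// e.g. p75(data.record, 'largest_contentful_paint')
function p75(record, metric) {
  return record.metrics[metric].percentiles.p75;
}

// Usage in the browser (Apps Script would use UrlFetchApp instead of fetch):
// fetch(`${ENDPOINT}?key=${KEY}`, {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(buildQuery('https://example.com', 'PHONE')),
// })
//   .then((r) => r.json())
//   .then((data) => console.log(p75(data.record, 'largest_contentful_paint')));
```

From there it's a small step to write the p75 values into a sheet.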
Limits with Chrome UX Report API
- The 28-day rolling average of the API data makes effects visible only with quite a big delay. The same applies to the field data in Pagespeed Insights.
- The BigQuery data does not have this rolling average, but it does have a delay (please comment if I'm wrong here)
- Small sites are left out again
- If you want to do more than the basic stuff in Data Studio, you probably need some BigQuery or API skills.
Use case
- Check competitors, market, country differences
- It’s research / reporting rather than operations
3) Pagespeed Insights field data
The data shown in Pagespeed Insights field data is also a 28-day rolling average. If you want to check the effect of your improvements here, it will take 28 days to see them fully.
Use case (of the field data in Pagespeed Insights)
- e.g. one time check when you start with a new client
4) Field data you track yourself with the JavaScript library
If you really want to know over time what's happening on your site, track the three KPIs with the JavaScript library and push them, e.g. with Google Tag Manager, to Google Analytics as events.
Currently that's the best near-realtime option to track how YOUR site and users are affected by e.g. a current deployment* or a recent problem. It's the only option to see something the same day: the fastest Web Vitals feedback loop from real users.
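A minimal sketch of that setup, assuming the web-vitals library (the getCLS/getFID/getLCP API) is loaded on the page; the function names and dataLayer keys below are my own choices, not a fixed convention:

```javascript
// CLS is a unitless score while LCP/FID are in milliseconds; GA event values
// must be integers, so scale CLS up before rounding (a common convention).
function toEventValue(name, delta) {
  return Math.round(name === 'CLS' ? delta * 1000 : delta);
}

// Push one metric into the GTM dataLayer; a GTM event tag can then
// forward it to Google Analytics as an event.
function sendToDataLayer({ name, delta, id }) {
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({
    event: 'web-vitals',
    eventCategory: 'Web Vitals',
    eventAction: name,                     // 'LCP', 'FID' or 'CLS'
    eventValue: toEventValue(name, delta),
    eventLabel: id,                        // unique per page load, good for aggregation
  });
}

// In the page (or a GTM custom HTML tag):
// import { getCLS, getFID, getLCP } from 'web-vitals';
// getCLS(sendToDataLayer);
// getFID(sendToDataLayer);
// getLCP(sendToDataLayer);
```

In GTM you then create a trigger on the `web-vitals` event and map the category, action, value and label fields into a GA event tag.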
If you collect the data in GA you can connect the events with all other GA metrics. In the example here, the Avg. Value for LCP is in milliseconds, with Country as a secondary dimension:
I recommend creating additional dashboards with the collected data. You probably want to calculate the shares of poor and good measurements.
To compare, use the good/needs improvement/poor events for LCP, FID and CLS in Chrome vs. all Chrome traffic, because Web Vitals are not trackable in all browsers (and finding out which browsers are Chromium-based isn't easy with browsers like “ZhihuHybrid DefaultBrowser osee2unifiedRelease” showing up in GA).
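For the dashboard calculations, bucketing values by the published Web Vitals thresholds can be done with a small helper. This is a sketch of my own, not part of the library:

```javascript
// Sketch: bucket metric values into good / needs improvement / poor,
// using the published Web Vitals thresholds
// (LCP and FID in milliseconds, CLS unitless).
const THRESHOLDS = {
  LCP: [2500, 4000],
  FID: [100, 300],
  CLS: [0.1, 0.25],
};

function rate(name, value) {
  const [good, poor] = THRESHOLDS[name];
  if (value <= good) return 'good';
  if (value <= poor) return 'needs improvement';
  return 'poor';
}

// e.g. rate('LCP', 2000) → 'good', rate('CLS', 0.3) → 'poor'
```

You can run this either client-side before pushing the event, or later over the raw values in your reporting.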
If you want to lower the hit count in GA 360, maybe just push the poor measurements.
Here, LCP is only pushed when it is above 4 seconds, i.e. poor LCP:
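A sketch of that filtering; pushIfPoorLCP is a made-up helper name, while the metric object shape (name, delta) follows the web-vitals callbacks:

```javascript
// Sketch: only push LCP when it is in the "poor" bucket (above 4000 ms),
// to keep the GA hit count down.
function pushIfPoorLCP(metric, dataLayer) {
  if (metric.name === 'LCP' && metric.delta > 4000) {
    dataLayer.push({
      event: 'web-vitals-poor',
      eventAction: 'LCP',
      eventValue: Math.round(metric.delta),
    });
    return true;
  }
  return false;
}

// In the browser: getLCP((metric) => pushIfPoorLCP(metric, window.dataLayer));
```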
You will lower the hit count even more if you improve the KPIs 😉
Use case
- The best and most recent data you can get for your site. If you want to do serious Web Vitals improvements, this is the data you need.
Useful material
SQL for CrUX in BigQuery
Web.dev Live has some interesting videos about Web Vitals data, e.g.:
*You should of course use lab data and testing before deployment, to reduce the number of new poor scores you would otherwise only learn about later from field data.