
Collecting data on your site's Web Vitals is the first step toward improving them. A full analysis gathers performance data from both lab and real-world environments. Measuring Web Vitals requires minimal code changes and can be done with free tools.

Measuring Web Vitals Using RUM Data

Real user monitoring (RUM), also known as field data, captures the performance experienced by actual users of a site. RUM data is what Google uses to determine if a site meets Core Web Vitals recommended thresholds.

Getting started

If you don't have a RUM setup, the following tools will quickly provide you with data on your site's real-world performance. All of these tools are based on the same underlying data set, the Chrome User Experience Report (CrUX), but they have slightly different use cases:

  • PageSpeed Insights (PSI): PageSpeed Insights reports aggregated page-level and origin-level performance over the last 28 days. It also offers suggestions on how to improve performance. If you are looking for a single action to start measuring and improving your site's Web Vitals, we recommend using PSI to audit your site. PSI is available on the web and as an API.
  • Search Console: Search Console reports performance data per page, which makes it suitable for identifying specific pages that need improvement. Unlike PageSpeed Insights, Search Console reports include historical performance data. Search Console can only be used with sites whose ownership you have verified.
  • CrUX Dashboard: The CrUX Dashboard is a pre-built dashboard that displays CrUX data for an origin of your choice. It is built on top of Data Studio, and the setup process takes about a minute. Compared to PageSpeed Insights and Search Console, the CrUX Dashboard reports include more dimensions; for example, data can be broken down by device and connection type.

Although the tools listed above are well suited to getting started with measuring Web Vitals, they are also useful in other contexts. In particular, both CrUX and PSI are available as APIs and can be used to build dashboards and other reports.
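As a sketch of how the CrUX API can be queried programmatically, the snippet below builds a request for an origin's field data. The API key and origin are placeholders, and the network call itself is left commented out; the exact set of metric names supported may vary, so treat the list here as illustrative.

```javascript
// Sketch: building a CrUX API query for an origin's field data.
// 'YOUR_API_KEY' and the origin are placeholders, not real values.
const CRUX_ENDPOINT = 'https://chromeuxreport.googleapis.com/v1/records:queryRecord';
const apiKey = 'YOUR_API_KEY';

function buildCruxRequest(origin) {
  return {
    url: `${CRUX_ENDPOINT}?key=${apiKey}`,
    options: {
      method: 'POST',
      headers: {'Content-Type': 'application/json'},
      // The request body names the origin and the metrics of interest.
      body: JSON.stringify({
        origin,
        metrics: [
          'largest_contentful_paint',
          'first_input_delay',
          'cumulative_layout_shift',
        ],
      }),
    },
  };
}

const req = buildCruxRequest('https://example.com');
// In a real script you would then send the request:
// fetch(req.url, req.options).then(r => r.json()).then(console.log);
```

The response contains histograms and percentile values per metric, which is what makes the API useful for building custom dashboards.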

RUM data collection

Although CrUX-based tools are a good starting point for investigating Web Vitals performance, we strongly recommend supplementing them with your own RUM data. RUM data that you collect yourself can provide more detailed and more immediate feedback on your site's performance, which makes it easier to identify problems and test possible solutions.

CrUX-based data sources report data at roughly month-long granularity, though the details vary slightly by tool. For example, PSI and Search Console report performance observed over the last 28 days, while the CrUX Dashboard and dataset are broken down by calendar month.

You can collect your own RUM data by using a dedicated RUM provider or by setting up your own tools.

Dedicated RUM providers specialize in collecting and reporting RUM data. To use these services for Core Web Vitals, ask your RUM provider about enabling Core Web Vitals monitoring for your site.

If you don't have a RUM provider, you may be able to augment your existing analytics setup to collect and report on these metrics using the web-vitals JavaScript library. This method is explained in more detail below.

The web-vitals JavaScript library

If you are implementing your own RUM setup to measure Web Vitals, the easiest way to collect those measurements is the web-vitals JavaScript library. web-vitals is a small, modular library (~1 KB) that provides a convenient API for collecting and reporting each of the field-measurable Web Vitals metrics.

Not all of the metrics that make up Web Vitals are directly exposed by the browser's built-in performance APIs; rather, they are built on top of them. For example, Cumulative Layout Shift (CLS) is implemented using the Layout Instability API. By using web-vitals, you don't need to worry about implementing these metrics yourself; the library also ensures that the data it collects matches the methodology and best practices for each metric.

For more information on using web-vitals, refer to its documentation and the guide on best practices for measuring Web Vitals in the field.
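The library exposes one function per metric that takes a callback and invokes it with a metric object. The sketch below shows the callback shape only; the import is commented out so the snippet stands alone, and `fakeMetric` is a made-up stand-in for the object the library would pass (the exact function names, e.g. `onCLS` vs. `getCLS`, depend on the library version).

```javascript
// Sketch of a web-vitals metric handler. The real library invokes the
// handler with an object like {name, value, id, entries, ...}.
// import {onCLS, onFID, onLCP} from 'web-vitals';

function logMetric(metric) {
  // Each callback receives the metric's name and its current value.
  const line = `${metric.name}: ${metric.value}`;
  console.log(line);
  return line;
}

// On a real page you would register the handler per metric:
// onCLS(logMetric);
// onFID(logMetric);
// onLCP(logMetric);

// Stub invocation, purely to illustrate the callback shape:
const fakeMetric = {name: 'LCP', value: 1830, id: 'v3-123', entries: []};
const line = logMetric(fakeMetric);
```

In practice the handler would forward the metric to your analytics backend rather than log it.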

Data aggregation

It is essential that you report the measurements that web-vitals collects: if the data is measured but never reported, you will never see it. The web-vitals documentation includes examples showing how to send the data to a generic API endpoint, Google Analytics, or Google Tag Manager.
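A reporting handler along those lines might look like the following sketch, modeled on the generic-endpoint pattern in the web-vitals docs. The `/analytics` URL is a placeholder, and the transport is guarded so the serialization logic can run outside a browser.

```javascript
// Sketch: reporting a metric to a generic endpoint ('/analytics' is a
// placeholder). sendBeacon/fetch only exist in browsers, so the transport
// step is guarded behind a window check.
function serializeMetric(metric) {
  return JSON.stringify({name: metric.name, value: metric.value, id: metric.id});
}

function sendToAnalytics(metric) {
  const body = serializeMetric(metric);
  if (typeof window !== 'undefined') {
    // Prefer sendBeacon, which survives page unload; fall back to a
    // keepalive fetch where sendBeacon is unavailable.
    (navigator.sendBeacon && navigator.sendBeacon('/analytics', body)) ||
      fetch('/analytics', {body, method: 'POST', keepalive: true});
  }
  return body; // returned here only so the sketch can be inspected
}

const payload = sendToAnalytics({name: 'CLS', value: 0.08, id: 'v3-456'});
```

The handler would be passed to each web-vitals registration function, so every metric update is serialized and sent the same way.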

If you already have a favorite reporting tool, consider using it. Otherwise, Google Analytics is free and can be used for this purpose.

When considering which tool to use, it helps to think about who will need access to the data. Companies typically get the best results when the entire company, rather than a single department, is invested in improving performance. See the guide on fixing site speed cross-functionally for advice on getting buy-in from different departments.

Data interpretation

When analyzing performance data, pay attention to the tails of the distribution. RUM data often reveals that performance varies widely: some users have fast experiences, others slow ones. Using the median to summarize the data can easily mask this behavior.

For Web Vitals, Google uses the percentage of "good" experiences, rather than statistics such as medians or averages, to determine if a site or page meets recommended thresholds. Specifically, for a site or page to be considered to meet Core Web Vitals thresholds, 75% of page views must meet the "good" threshold for each metric.
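The 75th-percentile check can be sketched as a small function. The thresholds below are the documented "good" limits (LCP ≤ 2500 ms, FID ≤ 100 ms, CLS ≤ 0.1); the sample values are illustrative, not real measurements.

```javascript
// Sketch: summarizing field data at the 75th percentile, the way Google
// evaluates Core Web Vitals, instead of using a mean or median.
function percentile(values, p) {
  const sorted = [...values].sort((a, b) => a - b);
  // Nearest-rank percentile: the value at or above p% of samples.
  const index = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, index)];
}

function meetsThreshold(values, goodLimit) {
  // A page passes when the value at the 75th percentile is within "good".
  return percentile(values, 75) <= goodLimit;
}

// Illustrative LCP samples in milliseconds from eight page views.
const lcpSamples = [1200, 1900, 2100, 2300, 2400, 2600, 3100, 4800];
const passes = meetsThreshold(lcpSamples, 2500); // p75 is 2600 ms, so false
```

Note how most samples here are "good", yet the page still fails: the 75th percentile deliberately weights the slower end of the distribution.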

Measuring Web Vitals Using Lab Data

Lab data, also known as synthetic data, is collected from a controlled environment, rather than from actual users. Unlike RUM data, lab data can be collected from pre-production environments and therefore can be incorporated into developer workflows and continuous integration processes. Examples of tools that collect synthetic data are Lighthouse and WebPageTest.

Considerations

There will always be discrepancies between RUM data and lab data, especially if network conditions, device type, or lab environment location differ significantly from those of users. However, when it comes to collecting lab data on Web Vitals metrics in particular, there are a couple of specific considerations that are important to keep in mind:

  • Cumulative Layout Shift (CLS): CLS measured in lab settings may be artificially lower than the CLS observed in RUM data. CLS is defined as the "sum total of all individual layout shift scores for every unexpected layout shift that occurs during the entire lifespan of the page." However, the lifespan of a page is often very different depending on whether it is loaded by a real user or by a synthetic measurement tool. Many lab tools only load the page; they don't interact with it. As a result, they only capture layout shifts that occur during the initial page load. In contrast, the CLS measured by RUM tools captures unexpected layout shifts throughout the entire lifespan of the page.
  • First Input Delay (FID): First Input Delay cannot be measured in lab environments because it requires user interaction with the page. As a result, Total Blocking Time (TBT) is the recommended lab proxy for FID. TBT measures the "total amount of time between First Contentful Paint and Time to Interactive during which the page is blocked from responding to user input." Although FID and TBT are calculated differently, they are both reflections of a blocked main thread during page load. When the main thread is blocked, the browser is slow to respond to user interactions. FID measures the delay, if any, the first time a user tries to interact with a page.
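The relationship between long tasks and TBT can be made concrete with a short, simplified calculation. A "long task" is any main-thread task over 50 ms, and TBT sums only the portion beyond 50 ms for tasks in the FCP-to-TTI window. The task timings below are illustrative, and the window check is simplified to the task's start time.

```javascript
// Sketch: deriving Total Blocking Time from a list of main-thread tasks.
// Only the part of each task beyond 50 ms counts as "blocking", and
// (simplified here) only tasks starting between FCP and TTI are included.
function totalBlockingTime(tasks, fcp, tti) {
  return tasks
    .filter(t => t.start >= fcp && t.start < tti && t.duration > 50)
    .reduce((sum, t) => sum + (t.duration - 50), 0);
}

// Illustrative task timings in milliseconds, not real measurements.
const tasks = [
  {start: 800, duration: 120},  // long task: contributes 70 ms
  {start: 1500, duration: 40},  // under 50 ms: not a long task
  {start: 2200, duration: 250}, // long task: contributes 200 ms
];
const tbt = totalBlockingTime(tasks, 500, 3000); // 70 + 200 = 270 ms
```

This is why FID and TBT usually move together: shortening or splitting long tasks reduces both the blocking window a lab tool measures and the delay a real user's first interaction experiences.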

Tooling

The following tools can be used to collect lab measurements of Web Vitals:

  • Web Vitals Chrome extension: The Web Vitals Chrome extension measures and reports Core Web Vitals (LCP, FID, and CLS) for a given page. It is intended to give developers real-time performance feedback as they make code changes.
  • Lighthouse: Lighthouse reports on LCP, CLS, and TBT and also highlights potential performance improvements. Lighthouse is available in Chrome DevTools, as a Chrome extension, and as an npm package. Lighthouse can also be incorporated into continuous integration workflows via Lighthouse CI.
  • WebPageTest: WebPageTest includes Web Vitals as part of its standard reports. WebPageTest is useful for collecting information about Web Vitals under particular device and network conditions.
