Core Web Vitals “No Data”: What It Is and Next Steps

Last updated on Mar 15th, 2024 | 9 min

Checking your site’s Core Web Vitals in PageSpeed Insights or Search Console and seeing a blank section instead of colorful charts is like hitting a brick wall.

[Image: Google Search Console Core Web Vitals report showing no data]

That’s because Google’s Core Web Vitals have become essential for site owners who don’t want to guess how users experience a website but want the numbers to back it up.

And more specifically, leveraging data about crucial moments, like:

  • How fast your main content loads (Largest Contentful Paint)
  • How quickly your page responds to user input (First Input Delay, now Interaction to Next Paint)
  • How visually stable your page is while it loads (Cumulative Layout Shift)

On March 12, 2024, FID was officially replaced by a new responsiveness metric – Interaction to Next Paint (INP). It aims to provide a more comprehensive view of how responsive your web pages are by measuring the longest interaction in the entire session.
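Here’s a quick way to see these metrics for your own visitors even before CrUX has anything to show: a minimal sketch using Google’s open-source web-vitals package (assuming you can add a small script to your pages):

```typescript
// A minimal sketch that logs the three Core Web Vitals for a real visit
// using Google's open-source "web-vitals" package (npm install web-vitals).
import { onLCP, onINP, onCLS } from "web-vitals";

onLCP((metric) => console.log("LCP:", metric.value, "ms"));
onINP((metric) => console.log("INP:", metric.value, "ms"));
onCLS((metric) => console.log("CLS:", metric.value)); // unitless layout-shift score
```

In production you’d send these values to an analytics endpoint instead of the console, giving you your own field data regardless of CrUX coverage.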


Moreover, page experience is officially a ranking signal in Google Search, so a passed Core Web Vitals assessment not only puts you in front of more people but also helps you engage and convert them better and faster.

[Image: Google PSI report with a passed Core Web Vitals assessment]

So, to what extent does the missing Core Web Vitals data affect your online business? To solve this riddle, you first need to understand the methodology behind Core Web Vitals data sourcing.
 

How is Core Web Vitals data sourced?

Google primarily relies on two sources for collecting this valuable data: the Chrome User Experience (CrUX) Report and Lighthouse audits. These sources offer insights into what website owners can do to enhance the user experience further.

CrUX Report vs Lighthouse

The CrUX (Chrome User Experience) report is a rich source of real-world user experience data. It collects field data from millions of Chrome users as they navigate the web. This extensive dataset covers more than 16 million origins, making it a valuable resource for understanding the broader web performance landscape.

In contrast, Lighthouse is an open-source tool developed by Google used to conduct lab tests of web performance. It simulates user interactions in a controlled environment and provides detailed performance metrics.
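To give you a feel for the lab side, here’s a sketch of a scripted Lighthouse run in Node, based on the package’s documented usage (the exact options are worth verifying against the current docs):

```typescript
// A sketch of a programmatic Lighthouse performance audit
// (npm install lighthouse chrome-launcher). Run as an ES module in Node.
import lighthouse from "lighthouse";
import * as chromeLauncher from "chrome-launcher";

const chrome = await chromeLauncher.launch({ chromeFlags: ["--headless"] });
const result = await lighthouse("https://example.com", {
  port: chrome.port, // talk to the Chrome instance we just launched
  onlyCategories: ["performance"],
});

console.log("Lab performance score:", result?.lhr.categories.performance.score);
await chrome.kill();
```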
 

Field Data vs Lab Data

Both field and lab data are presented in your Google PageSpeed Insights report.

Field data is derived from real users’ experiences as they visit your website during their daily online activities. It reflects your site’s actual performance for real visitors, offering a genuine assessment of the user experience it delivers.

Field data powers the Core Web Vitals assessment at the top of your report, and its absence is exactly why you’re reading this article.

[Image: Core Web Vitals assessment section in a Google PSI report]

Lab data is generated in controlled test environments. While it allows website owners to identify and address specific performance bottlenecks, it doesn't capture the variations and nuances of real-world usage.

[Image: Lab-based metrics in a Google PSI report]
 

Pros and Cons of Field Data

Field data, sourced from the CrUX Report, offers several advantages and some limitations.

A significant advantage is its authenticity. Since it represents actual user experiences, it provides a realistic view of a website's performance from a user's perspective. This can be invaluable for identifying critical issues that impact user satisfaction.

Failing your Core Web Vitals assessment is a tell-tale sign you need to focus your attention on your site’s performance if you want to leverage benefits like:

  • 8.6% more pages viewed in a session
  • 5.2% improvement in customer engagement
  • 8.4% more conversions
  • 9.2% average order value (AOV) increase

See the Before & After of powerful speed optimization. Test your website with NitroPack for free →

The downsides of field data on Core Web Vitals include:

  • Possibility of no aggregated data (duh)
  • Insufficient granularity to pinpoint root causes of performance issues (unless paired with further analysis)
  • Slow feedback on your optimizations, since Core Web Vitals results reflect a rolling 28-day collection window

Nonetheless, the benefits of analyzing field data and optimizing Core Web Vitals for your business outweigh the drawbacks significantly.
 

Why Is Core Web Vitals Field Data Not Available for My Website?

If you see no data for your Core Web Vitals in Google Search Console, it might be that your property is new, and the console is still checking the CrUX database.

Not your case? Well, let’s probe deeper.

Clicking on the tooltip next to the “No data available” message in your Google Search Console or Google PSI report reveals the following:

“The Chrome User Experience Report does not have sufficient real-world speed data for this page.”

Simply put, you see no field data because your website hasn’t generated enough traffic on desktop and/or mobile. It’s worth checking both, as they are reported separately.

So, you might be thinking that growing your website traffic should fix the issue, right?

It’s not quite so simple.

The CrUX report aggregates real-world speed data for origins following several essential requirements:

  • Users are opted-in to syncing their browsing history, have not set up a Sync passphrase, and have usage statistic reporting enabled;
  • Your site’s URLs are public (crawlable and indexable);
  • Your website is sufficiently popular (it has a minimum number of visitors across all of its pages) with distinct samples that provide a representative, anonymized view of the performance of the URL or origin.
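If you want a definitive answer on whether your origin made the cut, you can ask the public CrUX API directly; a 404 response means there’s no record for it. This sketch assumes you’ve created a Google Cloud API key (YOUR_API_KEY is a placeholder):

```typescript
// A sketch that checks whether an origin exists in the CrUX dataset by
// querying the public CrUX API. A 404 means CrUX has no field data for it.
async function hasCruxData(origin: string): Promise<boolean> {
  const res = await fetch(
    "https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=YOUR_API_KEY",
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ origin }),
    }
  );
  return res.ok; // 404 = not in the CrUX dataset
}

hasCruxData("https://example.com").then((found) =>
  console.log(found ? "Origin is in CrUX" : "No CrUX field data yet")
);
```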

Back in 2021, Martin Splitt from Google further clarified:

“...It can be enough visitors, if these visitors are not generating telemetry data, then we are still not having the telemetry data.

And even if we have some data, it might not be enough for us to confidently say that this is the data we think represents the actual signal. So we might decide to actually not have a signal for that if the data source is too flaky or if the data is too noisy.

… More traffic is more likely to generate data quickly, but it’s not a guarantee.”

 

So much for hoping for specific numbers.

You should also consider that a website may never become part of the CrUX dataset. When you think about it, CrUX tracks 16 million origins. Seems a lot, right?

However, when compared to the 1.13 billion websites on the Internet today, the CrUX dataset is but a small fraction.

To summarize:

  • Google Search Console might need more time to produce a Core Web Vitals report for new properties (if your website appears in the CrUX report)
  • Brand-new websites with little to no traffic have the lowest chance of entering the CrUX dataset
  • Websites must meet specific requirements regarding users and URL discoverability to become eligible for the CrUX report
  • Pages and origins that do not meet the popularity threshold are not included in the CrUX dataset

While Google can’t guarantee your website will enter the CrUX dataset so you can analyze your Core Web Vitals based on field data, it doesn’t mean your hands are tied.

Optimize page speed and improve user experience with a single tool. Get started with NitroPack today →

How to Improve Core Web Vitals and Performance Without Field Data

Until the CrUX report returns readable data, you can focus on alternative methods like monitoring other performance, server, and network metrics, performance auditing with GTmetrix, and analyzing user feedback and behavior.

Find a bonus tip most site owners don’t leverage at the end ;)

1. Monitor Lab Performance Metrics in Google PageSpeed Insights

When field data is missing, your next best move is to scroll down in your Google PSI report and start with the lab-based equivalents of Largest Contentful Paint (LCP) and Cumulative Layout Shift (CLS). Since Interaction to Next Paint (INP) has no direct lab-based equivalent, Total Blocking Time (TBT) is the closest lab proxy to focus on, along with First Contentful Paint (FCP) and Speed Index (SI).

[Image: Lab-based metrics in a Google PSI report]
 

  • Total Blocking Time (TBT): TBT measures the total amount of time during which the main thread of a web page is blocked and unable to respond to user input. It helps identify and address issues that can affect interactivity, such as delayed clicks or keyboard input. To provide a smooth user experience, TBT should be kept under 300 milliseconds (ms).

To reduce TBT, you can:

— Minimize or defer non-essential JavaScript;

— Optimize and limit the use of third-party scripts;

— Utilize web workers to offload heavy tasks (sketched below);

— Implement asynchronous loading for scripts.
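One of those techniques, offloading heavy work to a web worker, can look like this. The sketch inlines the worker via a Blob URL to stay self-contained; in a real build you’d ship the worker as its own file:

```typescript
// A self-contained sketch: moving an expensive computation off the main
// thread so it can't contribute to Total Blocking Time.
const workerSource = `
  self.onmessage = (e) => {
    let sum = 0;
    for (let i = 0; i < e.data; i++) sum += Math.sqrt(i); // heavy loop
    self.postMessage(sum);
  };
`;

const worker = new Worker(
  URL.createObjectURL(new Blob([workerSource], { type: "text/javascript" }))
);

worker.onmessage = (e: MessageEvent<number>) =>
  console.log("Computed off the main thread:", e.data);

// The main thread stays free to respond to clicks and keystrokes.
worker.postMessage(50_000_000);
```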

 

  • First Contentful Paint (FCP): FCP measures the time it takes for the first piece of content to appear on a web page when it starts loading. It's a critical user-centric metric because it indicates when users first see something happening on your page. For a good user experience, FCP should typically occur within 1 to 2 seconds of the page starting to load.
     

To improve First Contentful Paint, you should:

— Reduce server response times;

— Minimize render-blocking resources;

— Use lazy loading for non-essential resources;

— Reduce JavaScript execution time.

[Image: First Contentful Paint (FCP)]
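To see FCP for your own page loads, a few lines with the standard PerformanceObserver API will log it in the browser console:

```typescript
// A minimal sketch that logs First Contentful Paint via the standard
// PerformanceObserver API; "buffered" replays the entry if it already fired.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.name === "first-contentful-paint") {
      console.log(`FCP: ${Math.round(entry.startTime)} ms`);
    }
  }
}).observe({ type: "paint", buffered: true });
```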

 

  • Speed Index: This metric quantifies how quickly the contents of a web page are visibly populated. A lower Speed Index indicates faster page load times, and you should aim for a score of less than 1,000.

To improve your site’s Speed Index:

— Optimize and compress images and other media files;

— Minimize the use of large, above-the-fold images;

— Implement code-splitting to load only necessary JavaScript on the initial page load (see the sketch below).
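Here’s what the code-splitting item can look like with a dynamic import(). The module path and renderChart function are hypothetical stand-ins for your own below-the-fold feature:

```typescript
// A sketch of code-splitting: the chart module is only fetched when the
// user asks for it, keeping the initial bundle (and Speed Index) small.
const button = document.querySelector<HTMLButtonElement>("#show-chart");

button?.addEventListener("click", async () => {
  const { renderChart } = await import("./chart"); // hypothetical module, loaded on demand
  renderChart(document.querySelector("#chart-root")!);
});
```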

Get a 90+ performance score in Google PSI. NitroPack applies 35+ optimizations on autopilot for you →

2. Run Performance Analysis with GTmetrix

GTmetrix provides a more extensive set of performance metrics and customization options that will help you build a better optimization strategy.

[Image: GTmetrix timings strip example]
 

  • Time to First Byte (TTFB): TTFB measures the time it takes for the browser to receive the first byte of data from the web server after making an HTTP request. It's a crucial metric because it reflects server response times, including DNS resolution, server processing, and network latency. For a good user experience, aim for a TTFB of under 100 to 200 milliseconds.

To reduce TTFB:

— Optimize server and database performance;

— Use content delivery networks (CDNs);

— Minimize the number of HTTP requests;

— Implement browser caching for frequently requested resources.
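You can also read your own TTFB straight from the browser using the standard Navigation Timing API:

```typescript
// A minimal sketch that reads Time to First Byte from Navigation Timing;
// responseStart is measured from the start of the navigation.
const [nav] = performance.getEntriesByType(
  "navigation"
) as PerformanceNavigationTiming[];

if (nav) {
  console.log(`TTFB: ${Math.round(nav.responseStart)} ms`);
}
```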

 

  • Time to Interactive (TTI): TTI measures when a web page becomes fully interactive and responsive to user input, i.e., when a page is ready for users to engage with it. TTI should ideally occur within 3 to 5 seconds to provide a seamless user experience.

Generally, when implementing techniques to improve TBT, you should see significant improvements in TTI as well.

 

  • Resource Loading Metrics (Waterfall): These metrics encompass the load times of specific resources such as images, stylesheets, fonts, and scripts. Monitoring these in a waterfall chart helps identify bottlenecks in the loading sequence.

[Image: GTmetrix waterfall chart]

While there are no specific thresholds, aim to minimize the load times of critical resources that appear above the fold to achieve an overall faster page load.

To improve resource loading times:

— Compress images and use modern image formats like WebP;

— Optimize and consolidate CSS and JavaScript files;

— Speed up critical resource loading with priority hints (the fetchpriority attribute) and <link rel="preload"> (sketched below).
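Here’s the preload technique as a sketch. The image path is hypothetical, and in real pages you’d normally put the <link> tag directly in your HTML <head> so the browser discovers it as early as possible; the script form just keeps the example self-contained:

```typescript
// A sketch that preloads a critical hero image at high priority.
// "/images/hero.webp" is a hypothetical path; point it at your own asset.
const preload = document.createElement("link");
preload.rel = "preload";
preload.as = "image";
preload.href = "/images/hero.webp";
preload.setAttribute("fetchpriority", "high"); // priority hint
document.head.appendChild(preload);
```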

 

  • Onload Time: Onload Time marks the point when all page resources, including images and scripts, are loaded. Aim for an onload time of up to 3 seconds for an optimal user experience. Onload time is impacted by your optimization efforts across all other metrics we’ve discussed so far and reflects how well you’re doing.

 

  • Fully Loaded Time: Fully Loaded Time measures the complete loading process, i.e., when all resources on a web page, including images, scripts, and external content, have finished loading. Similar to Onload Time, it’s a sum of all other preceding metrics and how well-optimized they are.
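If you want to check your own Onload Time, the load event plus Navigation Timing gives it to you (GTmetrix’s Fully Loaded Time is tool-defined, so there’s no single browser API for it):

```typescript
// A minimal sketch that logs Onload Time once the browser's load event fires.
window.addEventListener("load", () => {
  const [nav] = performance.getEntriesByType(
    "navigation"
  ) as PerformanceNavigationTiming[];
  console.log(`Onload: ${Math.round(nav.loadEventStart)} ms`);
});
```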

Lazy loading, WebP conversion, built-in CDN, caching, and more! Find everything you need for faster load times in NitroPack →

3. Pay Attention to Server and Network Metrics

  • Server Metrics: Server-side metrics like CPU usage, memory usage, and server response times provide insights into the health and performance of your hosting infrastructure. These metrics are vital for understanding how efficiently your server handles incoming requests and processes data.

    Improvements like server code and script optimization, efficient algorithms and caching mechanisms, and scaling your hosting infrastructure will help reduce CPU usage. Regular server configuration and application optimizations will minimize memory consumption.

     
  • Network Metrics: Network metrics are related to the performance of data transmission over networks, including metrics like round-trip time (RTT). They help diagnose issues related to server location, network latency, and data transfer efficiency.

    Choose hosting providers with data centers closer to your target audience, implement content caching, optimize your site’s assets, and invest in a CDN provider to reduce network latency and improve data transfer efficiency.
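In the browser, Chrome also exposes a rough round-trip time estimate through the Network Information API (non-standard and not supported everywhere, hence the cast and the feature check):

```typescript
// A sketch that reads Chrome's estimated RTT via the Network Information API.
const connection = (navigator as any).connection;
if (connection?.rtt !== undefined) {
  console.log(`Estimated RTT: ${connection.rtt} ms (${connection.effectiveType})`);
}
```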

Optimize resource usage and eliminate server overhead with NitroPack’s all-in-one speed optimizations →

4. Analyze User Feedback and Behavior

  • Surveys and Feedback Forms: Create user-friendly surveys and feedback forms to collect structured feedback. Ask specific questions about the user experience, satisfaction, and pain points. Use tools like Google Forms, SurveyMonkey, or dedicated website feedback tools and plugins.
  • Heatmaps: Use heat mapping tools like Hotjar and Crazy Egg to visualize user interactions. Identify where users click, move their cursors, or spend the most time on your site. Heatmaps reveal popular and problematic areas on web pages.
  • Session Recording: Record user sessions to see how visitors navigate and interact with your website. Watch recordings to identify usability issues, confusion, or points of frustration. Tools like FullStory offer session recording features.
  • Customer Support Interactions: Your customer support agents are often the first point of contact and a source of invaluable insight. Review CS interactions, including emails, chats, and phone calls, to identify recurring issues, user complaints, and common questions.
     

5. Bonus: Start Your First Web Performance Budget

A web performance budget is a predetermined limit on various performance metrics that your website should adhere to. These metrics can include load times, page size, the number of HTTP requests, and more. The budget serves as a benchmark, setting clear boundaries for how your website should perform to ensure an optimal user experience.

Here are a few simple steps to help you get started with your first web performance budget:

  1. Define Key Metrics: Identify the performance metrics that matter most for your website (start with any of the lab metrics we’ve discussed so far).
  2. Set Benchmarks: List your current results, then set lower targets that meet your users’ expectations and industry standards.
  3. Monitor Regularly: Use Google PSI and GTmetrix to regularly measure and track your website’s performance against your established budget.
  4. Optimize Effectively: If your website exceeds the budgeted thresholds, use the techniques we shared earlier or check the Diagnostics section in PageSpeed Insights.
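To make step 3 repeatable, you could script the check against the public PageSpeed Insights API. The metric IDs below are Lighthouse audit names, and the budget thresholds are illustrative assumptions to replace with your own:

```typescript
// A sketch of a tiny performance-budget checker built on the public
// PageSpeed Insights v5 API. Thresholds are example values in milliseconds.
const BUDGET: Record<string, number> = {
  "largest-contentful-paint": 2500,
  "total-blocking-time": 300,
  "speed-index": 3400,
};

async function checkBudget(url: string): Promise<void> {
  const api = `https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=${encodeURIComponent(url)}`;
  const { lighthouseResult } = await (await fetch(api)).json();

  for (const [auditId, budget] of Object.entries(BUDGET)) {
    const value = lighthouseResult.audits[auditId]?.numericValue ?? 0;
    const verdict = value <= budget ? "OK" : "OVER BUDGET";
    console.log(`${auditId}: ${Math.round(value)} ms (budget ${budget} ms) ${verdict}`);
  }
}

checkBudget("https://example.com");
```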

 

Optimizing Site Performance with NitroPack

With or without field data, the quest for a faster, more responsive user experience remains a journey worth taking.

You don’t have to go it alone, though. 180K+ site owners like you delegate performance optimization to the most comprehensive tool on the market – NitroPack.

With advanced features that work on autopilot, you can have optimized images, code, and fonts to offer a lightning-fast user experience and grow your business sustainably.

Lora Raykova
Web Performance Buff

Lora has 7+ years of experience developing in-depth, specialized content for SaaS companies in the CEE region. She has sourced and collaborated with subject-matter experts on site speed optimization for WordPress, Core Web Vitals, and the future of web performance. "If you can't explain it to a 7-year-old, you've not researched it well enough" is Lora's guiding principle when publishing professional insight for complex technical topics.