To optimize Largest Contentful Paint (LCP), the related page speed metrics should be optimized too, such as First Paint, First Meaningful Paint, or Time to First Byte.
A page speed metric might not be included directly in the Core Web Vitals, but it can still affect the Core Web Vitals signals indirectly through the user experience. Every page speed metric is connected to the others in terms of page loading performance, so Largest Contentful Paint optimization within the scope of Core Web Vitals should be evaluated alongside related page speed metrics, not in isolation.
Largest Contentful Paint has a 25% weight in the final score of the Lighthouse page speed test as of v7.
The second most important Lighthouse page speed metric is Total Blocking Time, with another 25% weight. However, Total Blocking Time (TBT) is not included in the Core Web Vitals, because it cannot be measured reliably in the field: TBT is a lab metric, and its field counterpart, First Input Delay, represents responsiveness in the Core Web Vitals via Real User Metrics (RUM). Largest Contentful Paint, on the other hand, can be measured both in the lab and in the field, which is one reason it is included in the Page Experience Algorithm within the scope of Core Web Vitals (CWV). For a search engine, it is easy to measure, and it also has a contextual function for evaluating the web page.
Largest Contentful Paint is important for the initial contact section of a web page, the above-the-fold section of a document. The largest visible element in this section also signals the web page's purpose, characteristics, and relevance for certain queries or industries. In other words, LCP is a page speed metric, but the largest contentful element is also a contextual signal for search engines.
Also, the initial contact section of the web page communicates with users to define the page's purpose. To understand the importance of web page layout and what it signals to search engines, you can read the related documentation. A slow LCP might dilute the context of the web page layout or the initial contact section in the users' perception, and indirectly for search engines.
To determine the LCP element of a web page, the browser looks for the largest rendered content element, such as an image or a block of text, within the viewport. Visual completeness and visibility are the most important factors in determining the Largest Contentful Paint element. Knowing how the LCP element is chosen is helpful for optimizing LCP and for understanding the browser's working principles.
- If an element is selected as the LCP element for a long time while the web page is loading, the browser may not select a larger element that loads later as the LCP element.
- A container element that wraps three major child elements is not an LCP candidate itself; a single element as big as one of the three parts can be the LCP element instead.
- An image-based LCP element can change if images are resized during page loading. If the image becomes smaller than other candidates, the LCP element can change too.
- If an image element is resized, the smaller of its rendered and intrinsic sizes is used for LCP determination.
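These selection rules can be sketched with a small simulation. The snippet below is a hedged illustration, not the browser's actual implementation: it replays a stream of hypothetical LCP candidate entries (element names and sizes are invented) and keeps the largest candidate reported so far, the way later, larger paints supersede earlier ones.

```javascript
// Hypothetical LCP candidate entries, in the order the browser paints them.
// `size` is the visible area in px^2; only rendered, visible elements count.
const candidates = [
  { element: 'h1.title', size: 12000, time: 400 },
  { element: 'img.hero', size: 95000, time: 900 },
  { element: 'div.ad', size: 30000, time: 1500 }, // smaller: does not replace
];

// The reported LCP element is the largest candidate seen so far;
// a later, smaller element never replaces a larger earlier one.
function currentLcp(entries) {
  let lcp = null;
  for (const entry of entries) {
    if (!lcp || entry.size > lcp.size) lcp = entry;
  }
  return lcp;
}

console.log(currentLcp(candidates).element); // img.hero stays the LCP element
```

In a real page, the analogous entries come from a `PerformanceObserver` watching `largest-contentful-paint` entries; this sketch only models the "largest so far wins" behavior.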
LCP Elements can be chosen among the HTML Element types below.
- Image Elements
- Text Elements
- Video Poster Images
- HTML Elements with CSS Background Images
In the future, video and SVG elements may also be accounted for in LCP determination.
To optimize the Largest Contentful Paint, the methods below can be used.
- Use Critical CSS.
- Shorten the Critical Rendering Path.
- Use better server-side compression technologies, such as Brotli.
- Optimize your TCP Slow Start.
- Use Asynchronous CSS.
- Defer all of the non-content JS.
- Decrease the Layer Count to be rendered.
- Decrease the DOM Size.
- Use Shadow DOM.
- Use Service Worker.
- Optimize the HTTP Cache Strategy by using Browser-side Caching.
- Use reverse-proxy caching layers such as Varnish.
- Clean all the unused CSS and JS codes.
- Reduce the Request Count by creating bundles.
- Use variable fonts.
- Delay the rendering of below-the-fold content.
- Use HTTP/2 or HTTP/3 (QUIC) for better RTT.
- Compress all of the images by cleaning unnecessary pixels.
- Compress and Minify the HTML, CSS, JS.
- Use AVIF, WebP, or JPEG XL for images.
- Compress videos, and do not use GIF.
- Use Adaptive Serving for the slow connections.
- Use browser hints such as preload, or preconnect.
- Decrease the need for DNS resolution by reducing the number of hostnames that resources are loaded from.
The main reasons for the slow LCP are listed below.
- Slow server connection.
- An excessive DOM Size.
- Unnecessarily big and prioritized non-content-related third-party scripts.
- Bad resource load order.
Largest Contentful Paint (LCP) first appeared at the end of 2018 in the Lighthouse repository on GitHub. It was not an official announcement; the metric was used during the development process of these new page speed concepts.
The next paragraph includes opinions about the process of the LCP announcement.
In 2019, Bartosz Goralewicz published a video announcing that LCP would change page speed optimization in the future. I was one of the first people to watch this video, and I must thank him for publishing this awesome and brief video. During the following weeks, I researched Google's Lighthouse repository on GitHub to learn more, and I wrote an article called Advanced Page Speed Metrics, because "Core Web Vitals" was not yet a concept or term at that time. I think it was one of the first articles about Largest Contentful Paint and other new, modern page speed metrics.
LCP Thresholds are listed below.
- Below 2.5 seconds is fast.
- Between 2.5 seconds and 4 seconds is average.
- Above 4 seconds is bad.
To have a passing score for LCP, at least 75% of users should experience an LCP faster than 2.5 seconds in the Chrome User Experience Report.
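The 75th-percentile rule can be expressed in a few lines. The sketch below (the RUM samples and function names are hypothetical) computes the 75th percentile of a set of LCP field samples and classifies it against the thresholds above:

```javascript
// Classify an LCP value (in ms) against the Core Web Vitals thresholds.
function classifyLcp(ms) {
  if (ms <= 2500) return 'good';
  if (ms <= 4000) return 'needs improvement';
  return 'poor';
}

// 75th percentile of a list of samples (nearest-rank method).
function percentile75(samples) {
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil(0.75 * sorted.length) - 1;
  return sorted[rank];
}

// Hypothetical RUM samples: most page loads are fast, a few are slow.
const lcpSamples = [1800, 2100, 2300, 2400, 3900, 5200, 2200, 1900];
const p75 = percentile75(lcpSamples);
console.log(p75, classifyLcp(p75)); // 2400 'good'
```

Note how the two slow outliers (3.9 s and 5.2 s) do not fail the assessment, because the 75th percentile still falls under 2.5 seconds.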
If LCP still cannot be improved after applying every page speed optimization method, changing the LCP content itself can help. Switching the LCP element from an image to a text block, or to another image, can improve LCP scores. Sometimes changing the design or layout, or removing unnecessary page weight, also helps. In short, if you can't optimize LCP for a given LCP element, you can change the element itself.
First Input Delay (FID) is related to load responsiveness. FID measures the time difference between a user's input and the moment the browser can begin processing the event handlers for that input. FID is important for diagnosing web pages that are not responsive during the loading process. A web page can load fast but still fail to respond to the user's interaction; in that case, loading the content fast wouldn't have a meaningful impact on the user experience.
A representation of the first input delay
During the web page loading process, many CPU bottlenecks can happen. Without proper compression, minification, or resource load order optimization, the browser's main thread can hit bottlenecks. This situation prevents the web page from responding to the user's interactions.
If there are too many long tasks (tasks that take longer than 50 ms), input delay becomes more likely. And if long tasks happen frequently during the web page loading process, input delays can happen continuously, and the web page might not be able to satisfy the visit intent of the user.
Note: The event itself is not included in the First Input Delay measurement. Only the input and reaction time difference is included for the FID measurement.
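In field code, FID is commonly derived from a `first-input` performance entry as `processingStart - startTime`: the gap between the user's input and the moment the browser can start running the handler, excluding the handler's own duration. A minimal sketch with a mocked entry (the timestamp values are hypothetical):

```javascript
// A mocked `first-input` entry, shaped like the Event Timing API reports it.
const firstInputEntry = {
  startTime: 1200,        // when the user tapped/clicked (ms)
  processingStart: 1290,  // when the browser could start the handler (ms)
  processingEnd: 1340,    // when the handler finished (not part of FID)
};

// FID is only the delay before processing begins; the handler's own
// duration (processingEnd - processingStart) is excluded.
function firstInputDelay(entry) {
  return entry.processingStart - entry.startTime;
}

console.log(firstInputDelay(firstInputEntry)); // 90 (ms)
```

This mirrors the note above: the 50 ms the handler itself takes (1340 − 1290) contributes nothing to the 90 ms FID value.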
Even if there is no FID data within the lab tests, you can check the Long Tasks from a web page speed and loading performance test.
If there is no input on the web page, there is no FID measurement. Not every input creates input latency, and sometimes users do not interact with the web page through input events at all. If there is no input event, there is no input delay, and therefore no FID measurement.
Note: The sessions without an FID Measurement won’t be included in the FID Score calculation.
The related page speed metrics for FID are listed below.
- Time To Interactive
- Total Blocking Time
Time to Interactive measures when the web page is fully responsive to any interaction from users, while Total Blocking Time is the sum of the blocking portions of the long tasks during the web page loading process. TBT, FID, and TTI are the page speed metrics for loading responsiveness and runtime responsiveness.
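The relationship between long tasks and Total Blocking Time can be made concrete: for every long task, the portion beyond the 50 ms long-task threshold counts as blocking time, and TBT is the sum of those portions. A sketch with hypothetical task durations:

```javascript
// Total Blocking Time: for every long task (> 50 ms), the portion
// beyond 50 ms is "blocking time"; TBT is the sum of those portions.
const LONG_TASK_THRESHOLD_MS = 50;

function totalBlockingTime(taskDurationsMs) {
  return taskDurationsMs
    .filter((d) => d > LONG_TASK_THRESHOLD_MS)
    .reduce((sum, d) => sum + (d - LONG_TASK_THRESHOLD_MS), 0);
}

// Hypothetical main-thread tasks during loading: the 30 ms and 45 ms
// tasks are not long tasks, so they contribute nothing.
console.log(totalBlockingTime([30, 120, 45, 200, 80])); // 70 + 150 + 30 = 250
```

This also shows why many short tasks are preferable to a few long ones: breaking one 200 ms task into four 50 ms tasks removes its 150 ms contribution to TBT entirely.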
The thresholds for the FID are below.
- Below 100 ms is good.
- Between 100 ms and 300 ms is average.
- Above 300 ms is bad.
To pass the FID Score threshold, 75% of the users should have a good FID experience in the context of Core Web Vitals.
The input events measured for FID are clicks, taps, and key presses.
Note: Scrolling and zooming are not measured in FID, because they are continuous gestures rather than discrete input events.
Methods to optimize First Input Delay are listed below.
- Try to use advanced module bundlers such as Webpack.
- Decrease the third-party script count.
- Optimize the resource load order for giving the main thread idle times so that it can respond to the user.
- Optimize the server response timing.
- Implement code-splitting for the assets.
FID is an important user-centric web page loading performance metric because it measures the responsiveness of the web page. Every query has a need behind it. Every click on the SERP has a purpose. And, to satisfy the needs behind the query, and clicks on the SERP, a web page should be able to respond quickly. Even if the page is fast, the content is high quality and the source is reliable, it all counts for little if the web page cannot be used properly, as it cannot satisfy users.
FID is insurance for the Search Engines to keep the SERP full of high quality pages that can respond to the users. And, if a web page cannot respond to the users, its quality or reliability can be hindered in the eyes of the search engine, since it is not as useful to a user as it could be.
Paul Irish from Chrome Dev Summit 2018 while showing the concept of First Input Delay
Cumulative Layout Shift measures the layout shifts of web page elements during the lifespan of the web page and a user's session. Cumulative Layout Shift is a user-centric web page loading performance metric that focuses on visual stability. Any web page element that moves after being loaded and becoming visible is included in the Cumulative Layout Shift (CLS) measurement.
Cumulative Layout Shift can be measured with the Layout Instability API. Below, you can find the CLS formula for measuring it.
layout shift score = impact fraction * distance fraction
To understand how to measure Cumulative Layout Shift, the Impact Fraction and the Distance Fraction terms should be defined.
A representation of the Cumulative Layout Shift
CLS is a way to measure layout instability by looking at how much of a visual area of the screen is impacted as the page changes when loading.
It does this in three stages: first by identifying all elements that shift on the page, second by measuring the impact region (comparing the visual area before and after the shift), and finally by comparing this region to the viewport.
Impact Region / Viewport = Impact Fraction
So, Impact Fraction measures how much the page moves in relation to the size of the screen that the website is being loaded on. This is important as a big movement on a small screen is far worse than a small movement on a big screen.
As an example calculation for the impact fraction: if a web page element covers 50% of the screen and then shifts, its old and new positions are unified, and that union might cover 75% of the viewport. In that case, the impact fraction is 0.75.
In this context, the bigger the web page element, the greater the impact fraction its layout shift will create.
Impact Fraction Example from Google Devs
The distance fraction is not related to the size of the web page element or the area it covers; it is related to how far the element moves, relative to the viewport's largest dimension, in the direction of the shift. For instance, if a web page element is taller than it is wide, its layout shift event's distance fraction will be measured from the vertical distance it travels.
As an example calculation for the distance fraction: if a web page element sits in the top section of the viewport and moves down by 30% of the viewport height during a layout shift event, its distance fraction will be 0.30.
In this context, if the impact fraction is 0.75 and the distance fraction is 0.30, the layout shift score is calculated as below.
layout shift score = 0.75 (impact fraction) * 0.30 (distance fraction) = 0.225
Cumulative Layout Shift Score is the sum of all of the layout shift scores.
Distance Fraction example from Google Devs
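The worked example above can be reproduced in code. The sketch below computes a single layout shift score from an impact fraction and a distance fraction, then sums several shifts into a CLS total (the additional shift values are hypothetical):

```javascript
// layout shift score = impact fraction * distance fraction
function layoutShiftScore(impactFraction, distanceFraction) {
  return impactFraction * distanceFraction;
}

// CLS is the sum of the individual layout shift scores.
function cumulativeLayoutShift(shifts) {
  return shifts.reduce(
    (sum, s) => sum + layoutShiftScore(s.impact, s.distance), 0);
}

// The example from the text: 0.75 * 0.30, which is 0.225
// (subject to floating-point rounding in the printed value).
console.log(layoutShiftScore(0.75, 0.30));

// A second, smaller hypothetical shift added to the total.
console.log(cumulativeLayoutShift([
  { impact: 0.75, distance: 0.30 },
  { impact: 0.10, distance: 0.05 },
]));
```

Note how even small shifts accumulate: the second shift alone scores only 0.005, yet it still pushes the CLS total higher.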
The Cumulative Layout Shift Thresholds are listed below.
- Below 0.1 is good.
- Between 0.1 and 0.25 is average.
- Above 0.25 is bad.
Note: Cumulative Layout Shift can happen in the direction of vertical moves or horizontal moves.
Cumulative Layout Shift Thresholds from Google
Not every layout shift is included in the CLS measurement; only unexpected layout shift events are. If a layout shift follows an expected user behavior such as clicking or hovering, it is expected web page behaviour for that interaction and therefore doesn't bother the user.
Thus, layout shift events can be categorized into two groups: unconditional (unexpected) layout shifts and conditional (user-triggered) layout shifts. Conditional layout shifts happen after a certain user interaction, within the scope of acceptable, standard web behaviours. Even if a page speed measurement tool reports that a conditional layout shift has been included in the CLS, the developer and the SEO should know that it is not a problem as long as it makes sense for the UX.
This is a conditional and expected CLS example
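In the Layout Instability API, this distinction is exposed through the `hadRecentInput` flag: shifts that occur within 500 ms of a discrete user input are flagged and left out of the CLS score. A minimal sketch over mocked layout-shift entries (the entry values are hypothetical):

```javascript
// Mocked layout-shift entries, shaped like the Layout Instability API
// reports them. `hadRecentInput` marks shifts within 500 ms of an input.
const layoutShiftEntries = [
  { value: 0.12, hadRecentInput: false }, // unexpected: counts toward CLS
  { value: 0.30, hadRecentInput: true },  // expected (user-triggered): ignored
  { value: 0.05, hadRecentInput: false }, // unexpected: counts toward CLS
];

// Only unexpected shifts are summed into the CLS score.
function clsFromEntries(entries) {
  return entries
    .filter((e) => !e.hadRecentInput)
    .reduce((sum, e) => sum + e.value, 0);
}

console.log(clsFromEntries(layoutShiftEntries));
```

The large 0.30 shift is ignored because the user triggered it; only the two unexpected shifts (0.12 + 0.05) reach the score.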
The reasons for CLS are listed below.
- Dynamic content injection.
- Ads that are loaded without explicit height or width.
- Browser-side image resizing.
- FOUT (flash of unstyled text) and FOIT (flash of invisible text).
- Using page elements without explicit dimensions.
To improve the CLS score, the following methods can be used.
- Use the font-display CSS descriptor with the "optional" value inside @font-face to solve FOIT.
- Preload the font files to solve FOIT.
- Use absolute width and height values for the ads as wrappers.
- Do not resize images on the browser side.
- Do not inject content dynamically.
- Try to improve the Layout Process of the web page.
To improve CLS you can check the Cumulative Layout Shift Guide.
To observe Cumulative Layout Shift, you can use Google Chrome's Performance tab and its Experience sub-tab. The Experience tab shows all Layout Shift, Long Task, and Input events. If you click one of the Layout Shift events, it shows the shifted element's position changes and overlays the web page element that has shifted.
To observe the Layout Shift during the web page loading process, follow the steps below.
- Open the Chrome DevTools.
- Switch to the Performance Tab.
- Record a performance profile.
- Choose the experience tab.
- Examine the layout shift events.
Cumulative Layout Shift is important because it shows the visual stability of web page elements during the web page loading process. CLS is not about page speed; it is about changes in the positions of web page elements. Layout shift events can create stress for the user by causing wrong and/or unwanted clicks or events.
A high cumulative layout shift score signals to a search engine that even if the web page is fast, high quality, reliable, and responsive during the loading process, it can create stress and harm users. Thus, a high, bad CLS score could result in a search engine ranking drop.
Hypertext Transfer Protocol Secure (HTTPS) is the secure version of HTTP. HTTPS provides secure, encrypted data transfer. HTTPS is important for transferring data securely, especially for e-commerce sites, email services, banks, and insurance providers. If a website uses login credentials, it should definitely use HTTPS. There are different types of SSL certificates, such as Domain Validation (DV), Organization Validation (OV), and Extended Validation (EV) certificates. On 7 August 2014, Google published a blog post and made it clear that HTTPS was a ranking signal.
Google also published a new tool for safe-browsing. And, if a website includes malicious code or malware, Google Chrome can flag the website, and Google can remove it from the SERP.
Google has many products, but every product of Google represents its vision that every SEO should follow, learn and understand.
TLS, or Transport Layer Security, is the successor to the protocol formerly known as Secure Sockets Layer (SSL). TLS ensures that communication between the server and the user can happen via private and public keys. The public key is open to anyone and is used to encrypt information sent to the server; the private key is controlled by the server and is used to decrypt information that has been encrypted with the public key.
To check the TLS Version for a website, you can follow the steps below.
- Open the Chrome Devtools.
- Choose More Tools.
- Choose the Security Tab.
- Check the Connection Section.
Below you can see an example of TLS version check with Google Chrome.
HTTPS is important for secure and private browsing. HTTPS was announced by Google under the slogan "HTTPS Everywhere". As a ranking signal, HTTPS is increasing in importance every year. Using fast and secure HTTPS will help websites thrive on the SERP. It's table stakes for all websites.
If a web page contains high quality content and is trustworthy, fast, responsive during the loading process and visually stable but it is not safe, Google won’t consider the web page as a proper candidate for the given query.
To pass the Core Web Vitals assessments, websites should be mobile-friendly. A mobile-friendly design has three main components.
- Responsiveness for different viewports.
- Everything is crawlable.
- A great user experience.
A web page can change its elements' sizes according to the viewport width of the device or the browser. Responsive design helps keep the content the same across different devices. To keep a web page responsive, different CSS features can be used, such as media queries or clamp(). In terms of page speed and usability, one-line CSS layout techniques can be more useful for visual responsiveness.
Responsive Web Design Announcement and Guideline from Google
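The idea behind the CSS clamp() function can be illustrated in code: a fluid value scales with the viewport but stays pinned between a minimum and a maximum. The sketch below mimics a hypothetical `clamp(1rem, 2.5vw, 2rem)` font-size rule, assuming 1rem = 16px:

```javascript
// Mimics CSS clamp(min, preferred, max).
function clamp(min, preferred, max) {
  return Math.min(Math.max(min, preferred), max);
}

// clamp(1rem, 2.5vw, 2rem) with 1rem = 16px: the font size tracks
// 2.5% of the viewport width but never leaves the 16-32 px range.
function fluidFontSize(viewportWidth) {
  return clamp(16, 0.025 * viewportWidth, 32);
}

console.log(fluidFontSize(360));  // narrow phone: pinned at 16
console.log(fluidFontSize(1024)); // tablet: 25.6 (scales fluidly)
console.log(fluidFontSize(1920)); // wide desktop: pinned at 32
```

Because one declaration covers every viewport width, no media-query breakpoints are needed for this particular value, which is the appeal of such one-line responsive techniques.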
To keep a website responsive, knowing the most used screen widths is important. The most used screen widths for desktop devices are below.
- 1920×1080 (19.57%)
- 1366×768 (14.88%)
- 1440×900 (9.18%)
- 1536×864 (7.22%)
- 1024×768 (4.73%)
The most used screen widths for mobile devices are below.
- 414×896 (19.44%)
- 375×667 (13.67%)
- 375×812 (12.3%)
- 414×736 (8.91%)
- 360×640 (8.21%)
Data Source: Statcounter
An SEO, UX Expert, or Developer should test all of these screen widths for responsiveness to pass the Core Web Vitals Assessments.
Lastly, responsive design is not just about viewport width! I recommend watching Una Kravets' talk from the Google I/O 2021 event.
The common mobile design and mobile-friendliness mistakes are listed below.
- Using interstitials.
- Using adaptive serving.
- Using different URLs for different devices.
- Using different content for different devices.
- Using different links for different devices.
- Keeping the clickable elements too close to each other.
- Using a fixed viewport for designing the website.
- Small font size.
- Redirecting a user to another page version according to the user-agent.
- Using image and video formats that are not compatible with mobile devices.
- Blocking JS, CSS, font, and image files from crawling.
To test a web page’s mobile friendliness, the tools below can be used.
- Google Search Console URL Inspection Tool
- Google Mobile-Friendly Test Tool
- Bing Mobile Friendliness Test Tool
- Yandex Mobile-Compatibility Test Tool
In the mobile-friendly test tools, a search engine can show a webmaster how it perceives a web page, and what it could crawl and digest from the webpage. Also, Google Analytics, or Adobe Analytics data can be used to examine the mobile-friendliness of a web page, by analyzing the audiences. Hotjar or Clarity can be used to record and watch the users’ sessions.
A Mobile-friendly design example from the Search Engine
The mobile user and mobile traffic percentages change from year to year, but in 2018 the desktop user and traffic percentage was exceeded by the mobile percentage. Today, mobile users and traffic account for 54%, according to Statista. Based on Perficient's data, the mobile user percentage is 68.1%. But, due to pandemic conditions, the desktop user and traffic percentage has increased.
Google has switched to a mobile crawling scheme for all websites. In 2018, Google announced mobile-first indexing, under which Google gives more weight to the mobile version of web pages for ranking. In 2021, Google switched to mobile-only indexing; in other words, if content doesn't exist within the mobile version, it won't be evaluated by Google.
Mobile-friendliness is important because most of the users are using mobile devices for browsing on the web. And, if a website is not designed with the mobile-first design mentality, it might create a frustrating experience for most of the users. Even if a web page has good, quality content, and it is secure, fast, responsive during the loading, and visually stable, if it is not usable by the users, it won’t be considered as a quality web page by the Page Experience algorithm in the context of Core Web Vitals.
To pass the core web vitals assessments, proper monetization is important. In 2021, Google started to publish “Monetization Policies” videos on YouTube. Monetization Policies include “invalid clicks and impressions”, “encouraging clicks or views from users”, “abusive experiences” and more. Most of these monetization policies are about ad publishers’ mentality and behaviour on the web. In the context of the Core Web Vitals, the proper ad experience for the users is important. If you deceive users, then Google will discover this and penalise you.
To use ads in a user-friendly way, ensure the ads do not cover the whole above-the-fold section of the page. No screenful of the page during scrolling should consist of more than 30% ads. To keep ads relevant to the web page, a website shouldn't use interstitials or too many repetitive ads that are irrelevant to the topic.
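The 30% guideline can be expressed as a simple density check. The function below is a hypothetical sketch: the threshold and the area model follow the rule of thumb in the text, not an official Google formula.

```javascript
// Hypothetical check: the total ad area visible in one screenful
// should not exceed 30% of the viewport area.
const MAX_AD_DENSITY = 0.30;

function adDensityOk(adRects, viewport) {
  const viewportArea = viewport.width * viewport.height;
  const adArea = adRects.reduce((sum, r) => sum + r.width * r.height, 0);
  return adArea / viewportArea <= MAX_AD_DENSITY;
}

const viewport = { width: 360, height: 640 }; // a common mobile viewport

// One 320x100 banner: ~14% of the screen, within the guideline.
console.log(adDensityOk([{ width: 320, height: 100 }], viewport)); // true

// Adding a 300x250 rectangle pushes the density past 30%.
console.log(adDensityOk(
  [{ width: 320, height: 100 }, { width: 300, height: 250 }], viewport));
```

The same 300×250 unit that fails here would pass comfortably on a 1920×1080 desktop viewport, which is why ad density should be evaluated per device class.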
Natural ad placement is important to keep a web page profitable and also user-friendly. In fact, banner blindness can happen in the case of an excessive amount of ad usage.
Against aggressive monetization, Google has published the Page Layout Algorithm. The page layout algorithm of Google especially targets the excessive amount of ads on the above-the-fold section of the web pages.
And today we have the Page Experience Algorithm, which is similar to the Page Layout Algorithm in the context of monetization. Google has improved its concepts and algorithm configurations to detect an excessive amount of CTA elements or banners within a web page.
An announcement from Matt Cutts for the Page Layout Algorithm from 2012
If a user can’t see the main content of the web page because of the excessive amount of ads, it means that the web page can’t perform its purpose in the context of the search intent of the user. If a web page doesn’t let the user interact with the main content by displaying a pop-up, or irrelevant ads, forms, and CTAs, it means that the web page is abusing the user and the search engine’s decision tree.
An excessive amount of ads on a web page can oppress the context of the main content; this situation can create contextual dilution and make a search engine think the web page is actually about something else. Thus, using relevant ads with optimized placement is important to keep things consolidated and congruent.
Page Layout Algorithm Announcement from Google in 2012
In the past, Google published a tool to audit web page layouts, to help webmasters organize their page layout for the benefit of user experience in the context of ad placement. The tool's message is basically that users prefer fewer ads and a greater distance between ads.
If a web page includes an ad for a harmful industry, brand, or non-reliable product or organisation, this situation can harm the web page’s performance. Ads are part of the web page’s content. They should be secure, logical, honest, and legitimate. In the Core Web Vitals assessments, you should also care about the ads’ content, along with their impact on the user experience.
In the context of Core Web Vitals, proper monetization is important because an ad can harm a user with its content or its effect on the web page layout. It can suppress the main content and prevent users from focusing on the main search intent. It can distract the audience, and delay their interaction with the web page.
Even if a web page is secure, fast, visually stable, responsive during the loading process and mobile-friendly, if the web page has a harmful ad or bad ad placements, the web page won’t let users perform their purpose during their session. Thus, a web page with bad monetization practices can be dropped from rankings in the context of Core Web Vitals.
Google’s Monetization Policy Starting for Content Publishers.
Core Web Vitals is the conceptualized version of a good page experience. A secure, fast, relevant, and responsive web page can satisfy a user, and Google tries to improve SERP quality with this conceptualization and the Page Experience Algorithm. Core Web Vitals also focus heavily on web page speed, and a faster web page is also a cheaper web page to crawl. In both ways, Google improves its internal economics and SERP quality.
The Page Experience Algorithm is likely to change over time. Its content, ranking signals, and the weight of those signals can change, and all of these page speed metrics and security measurements can change their definitions or measurement methodologies. So, as SEOs, we should all follow state-of-the-art information to keep SEO projects ready for these future changes.
Finally, I’m passionate about Core Web Vitals as they are so closely connected to my day-to-day work at Holistic SEO, as it’s not just about page speed, it is related to the different verticals of SEO such as coding, user experience and security. And, anything that relates to the SERP and rankings is under the umbrella of SEO.
If you’ve got this far then you must also share my passion for technical SEO. If so, come and join me and other fellow optimisers in our community for a chat about all things Core Web Vitals and tech SEO.
You can also reach me by email, Koray Tuğberk GÜBÜR. I am happy to have contributed to the Authoritas Technical SEO Guide, and to have my name side-by-side with the dear, departed Hamlet Batista, who inspired me.