Cumulative Layout Shift (CLS)

Have you ever been reading an article online when something suddenly changes on the page? Without warning, the text moves and you’ve lost your place. Or worse: you’re about to tap a link or a button, but in the instant before your finger lands, BOOM, the link moves and you end up clicking on something else.

Most of the time these types of experiences are simply annoying, but in some cases, they can cause real damage.

A screencast illustrating how layout instability can negatively affect users.

Unexpected movement of page content usually happens because resources load asynchronously or DOM elements get dynamically added to the page above existing content. The culprit might be an image or video with unknown dimensions, a font that renders larger or smaller than its fallback, or a third-party ad or widget that resizes itself dynamically.

What makes this problem even more troublesome is that how a site works in development is often quite different from how users experience it. Custom or third-party content often doesn’t behave the same in development as it does in production, test images are often already in the developer’s browser cache, and API calls that run locally are often so fast that the delay isn’t noticeable.

The Cumulative Layout Shift (CLS) metric helps you address this problem by measuring how often it occurs for real users.

What is CLS? #

CLS is a measure of the largest burst of layout shift scores for every unexpected layout shift that occurs during the entire lifespan of a page.

A layout shift occurs whenever a visible element changes its position from one rendered frame to the next. (See below for details on how individual layout shift scores are calculated.)

A burst of layout shifts, known as a session window, occurs when one or more individual layout shifts happen in rapid succession, with less than 1 second between each shift and a maximum of 5 seconds for the total duration of the window.

The largest burst is the session window with the maximum cumulative score of all layout shifts within that window.

Example of session windows. The blue bars represent the scores of each individual layout shift.
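As a worked illustration of the windowing rules, consider three shifts in quick succession; the timestamps and scores below are made up for this sketch:

    // Three layout shifts (timestamps in ms) fall into one session window:
    // each gap is under 1 second and the whole window spans under 5 seconds.
    const shifts = [
      {startTime: 0,   value: 0.05},
      {startTime: 400, value: 0.10},
      {startTime: 900, value: 0.02},
    ];

    // The burst score for the window is the sum of its shift scores.
    const burst = shifts.reduce((sum, s) => sum + s.value, 0); // 0.17

    // A fourth shift at t = 2400 ms (1.5 s after the last one) would start
    // a new session window rather than extend this one.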

What is a good CLS score? #

To provide a good user experience, sites should strive to have a CLS score of 0.1 or less. To make sure you’re hitting this goal for most of your users, a good threshold to measure is the 75th percentile of page loads, segmented across mobile and desktop.

Good CLS values are 0.1 or less, poor values are greater than 0.25, and anything in between needs improvement.
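Expressed as a tiny helper (the function name and bucket labels here are just for illustration):

    // Bucket a CLS value using the thresholds above.
    function rateCLS(value) {
      if (value <= 0.1) return 'good';
      if (value <= 0.25) return 'needs improvement';
      return 'poor';
    }

    rateCLS(0.05); // 'good'
    rateCLS(0.3);  // 'poor'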

Layout shift score #

The layout shift score is the product of two measures of the shift: the impact fraction and the distance fraction (layout shift score = impact fraction * distance fraction).

Impact fraction #

The impact fraction measures how unstable elements impact the viewport area between two frames. In the image above, an element occupies half of the viewport in one frame. Then, in the next frame, the element shifts down by 25% of the viewport height. The red dotted rectangle indicates the union of the element’s visible areas in both frames, which in this case is 75% of the total viewport, so its impact fraction is 0.75.

Distance fraction #

The other part of the layout shift score equation measures the distance that unstable elements have moved, relative to the viewport. The distance fraction is the greatest distance any unstable element has moved in the frame (either horizontally or vertically), divided by the viewport’s largest dimension (width or height, whichever is greater).

Example of distance fraction with one unstable element

In the previous example, the largest viewport dimension is height, and the unstable element has moved by 25% of the viewport height, making the distance fraction 0.25.

So in this example, the impact fraction is 0.75 and the distance fraction is 0.25, so the layout shift score is 0.75 * 0.25 = 0.1875.
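The same calculation, written out for clarity:

    // The two measures from the example above, multiplied together.
    const impactFraction = 0.75;   // union of visible areas / viewport area
    const distanceFraction = 0.25; // largest move / larger viewport dimension

    const layoutShiftScore = impactFraction * distanceFraction;
    console.log(layoutShiftScore); // 0.1875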

The following example illustrates how adding content to an existing element affects the layout shift score:

Example of a layout shift with stable and unstable elements and viewport clipping

The “Click me!” button is appended to the bottom of the gray box with black text, which pushes the green box with white text down (and partially out of the viewport).

In this example, the gray box is resized, but its starting position does not change, so it is not an unstable element.

The “Click me!” button was not previously in the DOM, so its starting position does not change either.

The green box’s starting position, however, does change. But since it has moved partially out of the viewport, the invisible area is not considered when calculating the impact fraction. The union of the visible areas for the green box in both frames (illustrated by the red dotted rectangle) is the same as the area of the green box in the first frame: 50% of the viewport. The impact fraction is 0.5.

The distance fraction is illustrated by the purple arrow. The green box has moved down by about 14% of the viewport, so the distance fraction is 0.14.

The layout shift score is 0.5 x 0.14 = 0.07.

This last example illustrates several unstable elements:

Example of a layout shift with multiple stable and unstable elements

In the first frame above, there are four results of an API request for animals, sorted alphabetically. In the second frame, more results are added to the sorted list.

The first item in the list (“Cat”) does not change its starting position between frames, so it is stable. Similarly, the new items added to the list were not previously in the DOM, so their starting positions do not change either. But the items labeled “Dog,” “Horse,” and “Zebra” all shift their starting positions, making them unstable elements.

Again, the red dotted rectangles represent the union of the before and after visible areas of the three unstable elements, which in this case is about 60% of the viewport area (an impact fraction of 0.60).

The arrows represent the distances that the unstable elements have moved from their starting positions. The “Zebra” element, represented by the blue arrow, has moved the most, by about 30% of the viewport height. That makes the distance fraction in this example 0.3.

The layout shift score is 0.60 x 0.3 = 0.18.

Expected vs. unexpected layout shifts #

Not all layout shifts are bad. In fact, many dynamic web applications frequently change the starting position of elements on the page.

User-initiated layout shifts #

A layout shift is only bad if the user isn’t expecting it. On the other hand, layout shifts that occur in response to user interactions (clicking a link, pressing a button, typing in a search box, and the like) are generally fine, as long as the shift occurs close enough to the interaction that the relationship is clear to the user.

For example, if a user interaction triggers a network request that may take a while to complete, it’s best to create some space right away and show a loading indicator to avoid an unpleasant layout shift when the request completes. If the user doesn’t realize something is loading, or doesn’t have a sense of when the resource will be ready, they may try to click something else while waiting, something that could shift out from under them.
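A minimal sketch of that pattern; the element, endpoint, and expected height below are hypothetical:

    const container = document.querySelector('#results');

    async function onSearch(query) {
      // Reserve space and show an indicator immediately, so the arriving
      // response doesn't shift content the user may be about to click.
      container.style.minHeight = '200px'; // expected height of the results
      container.textContent = 'Loading…';

      const response = await fetch(`/api/search?q=${encodeURIComponent(query)}`);
      container.innerHTML = await response.text();
    }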

Layout shifts that occur within 500 milliseconds of user input will have the hadRecentInput flag set, so they can be excluded from CLS calculations.


Animations and transitions #

Animations and transitions, when done well, are a great way to update content on the page without surprising the user. Content that shifts abruptly and unexpectedly almost always creates a bad user experience, but content that moves gradually and naturally from one position to the next can often help the user better understand what’s going on, and guide them between state changes.

Be sure to respect prefers-reduced-motion browser settings, as some site visitors can experience ill effects or attention problems from animation.

CSS’s transform property allows you to animate elements without triggering layout shifts:

  • Instead of changing the height and width properties, use transform: scale().
  • To move elements around, avoid changing the top, right, bottom, or left properties, and use transform: translate() instead.
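A minimal sketch using the Web Animations API; the .box element and the keyframe values are hypothetical:

    const box = document.querySelector('.box');

    // Respect visitors who have asked the browser for reduced motion.
    const reduceMotion = window.matchMedia('(prefers-reduced-motion: reduce)').matches;

    // Animating transform (rather than top/left or width/height) runs on the
    // compositor and triggers no layout shifts.
    box.animate(
      [
        {transform: 'translateY(0) scale(1)'},
        {transform: 'translateY(100px) scale(1.5)'},
      ],
      {duration: reduceMotion ? 0 : 300, easing: 'ease-out', fill: 'forwards'}
    );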

How to measure CLS #

CLS can be measured in the lab or in the field, and is available in the following tools:


Field tools #

  • Chrome User Experience Report
  • PageSpeed Insights
  • Search Console (Core Web Vitals report)
  • web-vitals JavaScript library

Lab tools #

  • Chrome DevTools
  • Lighthouse
  • PageSpeed Insights
  • WebPageTest

Measuring layout shifts in JavaScript #

To measure layout shifts in JavaScript, use the Layout Instability API. The following example shows how to create a PerformanceObserver that logs layout-shift entries to the console:
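A minimal sketch of such an observer:

    // Log all layout-shift entries, including any that occurred before this
    // observer was created (buffered: true).
    new PerformanceObserver((entryList) => {
      for (const entry of entryList.getEntries()) {
        console.log('Layout shift:', entry);
      }
    }).observe({type: 'layout-shift', buffered: true});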

Measure CLS in JavaScript #

To measure CLS in JavaScript, you have to group these unexpected layout-shift entries into sessions and calculate the maximum session value. You can refer to the web-vitals JavaScript library source code, which contains a reference implementation of how CLS is calculated.
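A sketch along the lines of that reference implementation, grouping layout-shift entries into session windows and tracking the largest one:

    let clsValue = 0;
    let sessionValue = 0;
    let sessionEntries = [];

    new PerformanceObserver((entryList) => {
      for (const entry of entryList.getEntries()) {
        // Only count layout shifts without recent user input.
        if (!entry.hadRecentInput) {
          const firstSessionEntry = sessionEntries[0];
          const lastSessionEntry = sessionEntries[sessionEntries.length - 1];

          // If the entry occurred less than 1 second after the previous entry
          // and less than 5 seconds after the first entry in the session,
          // include it in the current session; otherwise start a new session.
          if (sessionValue &&
              entry.startTime - lastSessionEntry.startTime < 1000 &&
              entry.startTime - firstSessionEntry.startTime < 5000) {
            sessionValue += entry.value;
            sessionEntries.push(entry);
          } else {
            sessionValue = entry.value;
            sessionEntries = [entry];
          }

          // If the current session's value exceeds the CLS so far, update it.
          if (sessionValue > clsValue) {
            clsValue = sessionValue;
            console.log('Current CLS:', clsValue);
          }
        }
      }
    }).observe({type: 'layout-shift', buffered: true});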

In most cases, the current CLS value at the time the page is being unloaded is the final CLS value for that page, but there are a few important exceptions, noted in the next section. The web-vitals JavaScript library accounts for these as much as possible, within the limitations of the web APIs.

Differences between the metric and the API #

  • If a page is loaded in the background, or if it is backgrounded before the browser paints any content, then it should not report any CLS value.
  • If a page is restored from the back/forward cache, its CLS value must be reset to zero, since users experience this as a distinct page visit.
  • The API does not report layout-shift entries for shifts that occur within iframes, but the metric does, as they are part of the page’s user experience. This can show up as a difference between CrUX and RUM. To measure CLS correctly, you must take them into account: subframes can use the API to report their layout-shift entries to the parent frame for aggregation, as sketched below.
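One way to do that aggregation, assuming the parent page listens for messages of this shape (the message format and origin handling here are illustrative, not a standard):

    // Inside the iframe: forward each layout-shift entry to the parent frame.
    new PerformanceObserver((entryList) => {
      for (const entry of entryList.getEntries()) {
        parent.postMessage(
          {kind: 'layout-shift', value: entry.value, hadRecentInput: entry.hadRecentInput},
          '*' // restrict this to the parent page's origin in production
        );
      }
    }).observe({type: 'layout-shift', buffered: true});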

In addition to these exceptions, CLS has some added complexity due to the fact that it measures the entire lifespan of a page:

  • Users can keep a tab open for a very long time: days, weeks, months. In fact, a user may never close a tab.
  • On mobile operating systems, browsers typically do not run page unload callbacks for background tabs, which makes it difficult to report a “final” value.

To handle these cases, CLS should be reported every time a page moves to the background, in addition to every time it is unloaded (the visibilitychange event covers both scenarios). Analytics systems that receive this data then need to calculate the final CLS value on the backend.
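A sketch of that reporting, assuming clsValue holds the running CLS from the earlier sketch and /analytics is a hypothetical collection endpoint:

    // Report the current CLS value whenever the page is backgrounded or unloaded.
    document.addEventListener('visibilitychange', () => {
      if (document.visibilityState === 'hidden') {
        // sendBeacon delivers the data even while the page is going away.
        navigator.sendBeacon('/analytics', JSON.stringify({cls: clsValue}));
      }
    });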

Rather than memorizing and dealing with all of these cases yourself, you can use the web-vitals JavaScript library to measure CLS, which accounts for everything mentioned above:
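For example (assuming web-vitals v3 or later, where the CLS helper is named onCLS; earlier versions called it getCLS):

    import {onCLS} from 'web-vitals';

    // Measure and log CLS in all the situations where it needs to be reported.
    onCLS(console.log);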

How to improve CLS #

A comprehensive guide on optimizing CLS is available to walk you through the process of identifying layout shifts in the field and using lab data to drill down and optimize them.

Additional Resources #

  • Google Publisher Tag’s guide on minimizing layout shift
  • Understanding Cumulative Layout Shift by Annie Sullivan and Steve Kobes at #PerfMatters (2020)

CHANGELOG #

Occasionally, errors are discovered in the APIs used to measure metrics and sometimes in the definitions of the metrics themselves. As a result, changes sometimes need to be made, and these changes may appear as improvements or regressions in your internal reports and dashboards.

To help you manage this, all changes to the implementation or definition of these metrics will be displayed in this CHANGELOG.
