Google PageSpeed Insights has been radically updated: what will change? PageSpeed Insights, load speed as a ranking factor, and the Estimated Input Latency metric.

Hello, dear readers. Today I want to talk with you about an important topic: website loading speed. You have probably already heard that, along with many other factors influencing the promotion of a web project, search engines have recently begun to take this one into account as well. And a sluggish site genuinely irritates visitors, especially those accustomed to fast connections.

Nevertheless, for many webmasters loading speed is not a topic to rack their brains over right away; it is usually put off for later, while pressing tasks such as filling the resource with unique material and improving its internal optimization come to the fore. Ranking, however, depends on the entire set of factors, and, as they say, a chain is only as strong as its weakest link.

Online services for measuring website loading speed

In any case, that is how I felt until recently about trying to speed up my resource. But it occurred to me that it is better not to wait for unpleasant trends (namely, a drop in search traffic, since search engines have become increasingly critical of loading speed) but to correct the obvious shortcoming right now.

You just need to realize that the speed with which your website, blog, or forum loads is a very important indicator. If your project is not doing well on this front, it can lead to quite unpleasant consequences. The most important of them is that visitors may simply refuse to use your resource because its pages take too long to load. In addition, search engines, especially Google, take speed into account when assessing the overall usefulness of a resource.

To understand how quickly the pages of your resource load, you can use the tool described in detail in the linked article, or one of the online services designed for exactly this purpose. A few examples follow.

  • Pingdom - just enter the address of the page whose loading speed you want to measure (it does not have to be the home page; the loading speed of internal pages is no less, and often even more, important).

    As a result, you will be shown a waterfall chart of the load time of the individual web-page objects, along with their URLs and weight. The fewer objects loaded and the lighter each of them is, the better. For example, to this end I:

  • combined some of the pictures from the site's theme into CSS sprites,
  • and also, where possible, reduced the weight of the images,
  • beyond that, it makes sense to cut the overall number of requests to the server.
  • You can also track problems and the history of your site's speed measurements on the adjacent tabs of Pingdom's top menu.

    In a pop-up window you will be prompted to copy a permanent link to the completed test, or to send it by e-mail or Twitter. At the bottom of the window you can also subscribe to uptime monitoring: if your resource goes down (becomes unavailable to visitors), you will be sent a message by e-mail or SMS to your cell phone. This service is paid, although a free trial is available.

    To ping your site or view a traceroute to it, select the “Ping and Traceroute” tab at the very top of the page, enter the URL without http into the form provided, check the “Traceroute” or “Ping” checkbox under the form, and click “Test now”.

  • WebPageTest - as usual, enter the URL of the page you are testing (not necessarily the home page). The service takes some time to measure the loading speed of all site elements, after which it produces a very visual diagram (more precisely, two: one for the first pass and one for the second, when some of the site's elements are loaded from the browser cache):

    In the first diagram, pay attention to the position of the purple vertical line: it marks the moment rendering of the site completes. The second vertical line (blue) marks the full load time. It is good if the first line falls within 1-1.5 seconds of loading and the second before 4 seconds; then the following paragraphs can be read just “for reference”. If the site takes longer than 4 seconds to load, you need to worry about correcting the situation.

  • Google PageSpeed Insights is a tool for developers from Google itself. It scores the loading speed of your site (or rather, how well that speed is optimized) on a hundred-point scale. A score of 100 is an unattainable ideal, but 80-90 is quite achievable, especially since the service gives very detailed recommendations for correcting the shortcomings it finds.

    As you can see from the screenshot above, Google PageSpeed Insights gives a comprehensive assessment: one for desktop browsers and one for mobile. Below that you will find an assessment of how easy your resource is to use on various gadgets. If you have not yet bothered with mobile adaptation, the score there will be very low (and the screenshot of your website in the smartphone window on the right will make everything clear).

    But the most important thing is that Google PageSpeed Insights gives recommendations on how to raise your site's score, i.e. how to speed it up. Naturally, you should start from the very top of the list, because those corrections make the greatest contribution to the speedup.

    For example, I had problems setting up gzip compression and configuring the caching lifetimes of static files (images, CSS files, and scripts) in users' browsers, because my Apache works in conjunction with nginx, which I do not know how to configure. I had to write to Infobox technical support asking them to set everything up; they did, and did not even charge for it (thanks to them!). Initially, by the way, they set the cache lifetime to 1 hour, and Google PageSpeed Insights still complained:

    I had to dig into the instructions for this online service and read that the minimum required is 1 day of storing static files in the cache. I asked the hosting support to set 1 week, with a reserve, which they did. The rating has since increased a bit, and Google no longer has any major complaints about my resource in terms of loading speed, which is good.

  • Test My Site is a newer service, again from Google. It focuses mainly on evaluating the mobile version of your site, including by the criterion of its loading speed:

    Simple and tasteful, as they say. You can subscribe to the newsletter for changes.

  • GTmetrix - again, without further ado, enter the URL of the desired page and wait a bit for the analysis to complete. As a result you receive a report built from the data of two browser plugins: PageSpeed (read about working with it below) and YSlow. Which data to trust and whose recommendations to follow is up to you to decide.

    I have already written about this in some detail and therefore I will not repeat myself, so as not to clutter up an already cumbersome article (if you read to the end, you can consider yourself a hero).

  • Ping Admin is a similar online resource for measuring server response time from different parts of our vast planet.

  • Host Tracker is almost the same, only the countries are different.
  • ByteCheck - allows you to measure your site's TTFB (Time To First Byte), a value often watched during optimization. It is the time until the browser receives the first byte of data from the server. The higher the TTFB, the slower the server responds, which is bad. See also the tips for optimizing website loading.
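    To get a feel for what TTFB actually measures, here is a minimal sketch in Python. It spins up a throwaway local server purely so the example is self-contained; the numbers it prints are only illustrative, and a real measurement should point `measure_ttfb()` at your production host:

    ```python
    import http.client
    import http.server
    import threading
    import time

    def measure_ttfb(host, port, path="/"):
        """Time from sending the request until the first response bytes arrive."""
        conn = http.client.HTTPConnection(host, port, timeout=5)
        start = time.perf_counter()
        conn.request("GET", path)
        resp = conn.getresponse()  # returns once the status line has arrived
        ttfb = time.perf_counter() - start
        resp.read()                # drain the body
        conn.close()
        return ttfb

    # Throwaway local server so the example runs anywhere (port 0 = pick a free port)
    server = http.server.HTTPServer(("127.0.0.1", 0),
                                    http.server.SimpleHTTPRequestHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()

    ttfb = measure_ttfb("127.0.0.1", server.server_address[1])
    print(f"TTFB: {ttfb * 1000:.2f} ms")
    server.shutdown()
    ```

    Against a remote host the same timing would also include DNS lookup, connection setup, and network latency, which is exactly why TTFB is a useful proxy for how quickly the server starts answering.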
  • Load Impact is not entirely about speed, but it is also an important service. It allows you to test the load capacity of your site and whether the page loading speed decreases. A very useful thing.
  • Web Page Speed is an online service with a design from the early nineties, but it is quite informative once you adjust to the lack of usability. It also gives general recommendations for improving the situation.
    Is it really important to track page loading speed?

    But let's return from measuring speed to looking for opportunities to increase it. In Google's webmaster tools there used to be an experimental “Site Performance” tab, where an assessment of the site's loading speed was given.

    In itself, the loading speed shown there was nothing criminal, but the fact that my blog loaded more slowly than 84 percent of all web resources on the internet was already a warning bell that deserved a response. And since Google considered my blog an outsider in terms of speed, it was worth taking the problem seriously and getting down to the question of how to increase it.

    Actually, there was no need to invent anything special, because Google itself suggests the optimal path. More precisely, it suggests a tool that, in turn, helps you understand what exactly needs to be done to speed up your site a little (or a lot). I am talking about the online service PageSpeed (previously there were also browser extensions of the same name for Firefox and Chrome, which is mainly what I used).

    Let me note right away that this tool deals with things that are quite complex and not entirely clear to ordinary webmasters, mostly related to the intricacies and nuances of how the web server operates. If you have never administered a server, it will be difficult.

    There is a way out: ask your hoster to perform the actions that PageSpeed prescribes. Whether they agree is another question. I did not dare, because it felt awkward to hand server access to just anyone (that is how distrustful I am).

    On its main page, PageSpeed even suggests installing a module on your server if it runs Apache or Nginx (which is my case):

    But I still do not understand how that is done, because I do not understand server administration at all and have never worked with Unix-like systems. It is much harder than installing a program or uploading a plugin into WordPress - a different level of immersion altogether. I did not dare bother the hoster about it either. In short, I have not tested this module; perhaps you have already tried it and have something to say...

    The first time around I used PageSpeed as a browser extension (which, as I understand it, no longer functions). It used to be integrated into the developer tools of Firefox and Chrome. True, at first (several years ago) I only briefly looked at the advice it gave me and, understanding almost nothing, decided it was not for me, after which I deleted the PageSpeed plugin with a light heart as something unnecessary and alien.

    The fact is that even when I understood what the plugin was complaining about, I did not really know what had to be done to fix it all and somehow speed up my blog. So I immediately found plenty of more important things to do, compared to which fiddling with web server settings (which I do not particularly understand) seemed petty and insignificant.

    True, at one time I also shelved solving a security problem and, as a result, paid for it with real losses. With that lesson in mind, the other day I decided to dig in my heels and make progress on increasing the site's loading speed, no matter how incomprehensible and difficult the issue was for me.

    In short, I reinstalled PageSpeed in Firefox (nowadays this is no longer necessary), looked at what exactly it complained about most, and managed to improve a few things and, I hope, at least slightly increase the speed.

    P.S. PageSpeed can now only be used online and no longer needs to be installed in the browser (in any case, the plugin is incompatible with new versions of Chrome), though this does not change the essence.

    So, whereas previously you had to install a plugin in your browser, now all you have to do is open the service, enter the URL of the page you want to analyze (different types of pages may have different loading-speed problems, so it makes sense to check all of them in this tool), and click the blue “Analyze” button.

    After the verification process finishes, you will see a window similar to the one shown in the screenshot earlier in the text (in point 3 of the list of online services for checking website loading speed). Thus, as a result, you get a whole list of complaints that this online service has about your resource, specifically about its loading speed, along with instructions on optimizing the operation of the web server in conjunction with the engine you use.

    Moreover, at the very top of the PageSpeed window there will be the comments and recommendations to look at and act on first (“fix them without fail”), because they give the greatest effect in terms of loading speed and do not require too much effort on your part. Here is an example of an analysis of one of my minor projects, which I rarely get around to:

    That is, the recommendations and detected problems marked with a red rectangle and an exclamation mark at the very top of the list are the most important, and it is advisable to start optimization with them (cheap and cheerful, as they say), thereby achieving the greatest effect.

    Issues marked in orange will require more effort to correct but may not yield a very significant increase in speed. Alas, for now you can put them on the back burner and get on with the priority tasks that will noticeably speed up the project.

    My initial picture a few years ago (back when I used the plugin; the same can now be seen in http://gtmetrix.com/, since it uses the PageSpeed API) looked like this:

    I decided to start with the very first point, “Leverage browser caching” (now called “Use the browser cache”), because according to PageSpeed's logic these recommendations should give my blog the greatest acceleration.

    If you click the spoiler next to this item, a list appears of the various files that do not meet the optimal requirements for caching static objects (scripts, CSS files, images used on a web page) in the browsers of users (i.e. your readers):

    That is, to increase loading speed, PageSpeed Insights advises us to configure caching of the various elements of web pages in users' browsers so that, when other pages are viewed, these static elements are not reloaded from the server. In theory this all sounds rather confusing, especially if you have no idea about the caching mechanisms browsers use (what the browser cache is and how to clear it is a separate topic).

    In addition, we will optimize the caching of static objects using the mechanisms of the hosting server itself. It is rather confusing, so I will offer ready-made solutions found on the Russian-speaking internet; try them in action and decide which works best on your hosting.

    Optimizing browser caching and checking its operation

    True, this did not work on my current hosting, because I now have a combination of Apache and nginx (the latter needed configuring, which the hoster did for me in a way unknown to me). But if you have pure Apache, the method suggested below may well work.

    In general, we will try to influence the server where your project is hosted in such a way that it issues commands to browsers aimed at optimizing caching of static elements. We will do this using a fairly well-known remote server management tool - the .htaccess file. Do you know about the existence of such a thing?

    It usually lives in the root folder. Naturally, everything described below will only work on servers running Apache, but as a rule they are the majority. After connecting to your resource via FTP, open the root folder (usually either PUBLIC_HTML or HTDOCS) and check whether it contains a .htaccess file.

    From this point on, you act at your own risk. So be sure to download a copy of this file to your computer so that, if something goes wrong, you can quickly roll back.

    If .htaccess is not visible, then in FileZilla select “Server” → “Force showing hidden files” from the top menu. If it still does not appear in the root, create an empty text file on your computer in any convenient editor (I use Notepad++), name it anything, and copy it to the root.

    After that, rename this file to .htaccess in FileZilla. Now you will need to open it for editing and add the code below to it. But first, let me explain a little.

    The most popular ways to enable this on a web server running Apache are the mod_headers and mod_expires modules. The code below will enable caching of statics in the browser if at least one of these Apache modules is installed on your server.

    First, I'll give the code for the mod_headers module. Note that it includes a check for whether your hoster has this module: if it is not found, the code simply will not execute and will not cause any errors. Still, I once again strongly recommend copying the original .htaccess file (before adding the code below) to your computer to avoid incidents.

    <IfModule mod_headers.c>
    #cache html and htm files for one day
    <FilesMatch "\.(html|htm)$">
    Header set Cache-Control "max-age=43200"
    </FilesMatch>
    #cache css, javascript and text files for one week
    <FilesMatch "\.(js|css|txt)$">
    Header set Cache-Control "max-age=604800"
    </FilesMatch>
    #cache flash and images for a month
    <FilesMatch "\.(flv|swf|ico|gif|jpg|jpeg|png)$">
    Header set Cache-Control "max-age=2592000"
    </FilesMatch>
    #disable caching for dynamic files
    <FilesMatch "\.(pl|php|cgi|spl|scgi|fcgi)$">
    Header unset Cache-Control
    </FilesMatch>
    </IfModule>

    You can delete the comments (the lines beginning with a hash sign); either way, they do not affect the operation of the code.

    You can also add a block of code designed for the mod_expires module; it again checks for the module's presence on your server, which makes this piece of code safe to use:

    <IfModule mod_expires.c>
    ExpiresActive On
    #default cache is 5 seconds
    ExpiresDefault "access plus 5 seconds"
    #cache flash and images for a month
    ExpiresByType image/x-icon "access plus 2592000 seconds"
    ExpiresByType image/jpeg "access plus 2592000 seconds"
    ExpiresByType image/png "access plus 2592000 seconds"
    ExpiresByType image/gif "access plus 2592000 seconds"
    ExpiresByType application/x-shockwave-flash "access plus 2592000 seconds"
    #cache css, javascript and text files for one week
    ExpiresByType text/css "access plus 604800 seconds"
    ExpiresByType text/javascript "access plus 604800 seconds"
    ExpiresByType application/javascript "access plus 604800 seconds"
    ExpiresByType application/x-javascript "access plus 604800 seconds"
    #cache html and htm files for one day
    ExpiresByType text/html "access plus 43200 seconds"
    #cache xml files for ten minutes
    ExpiresByType application/xhtml+xml "access plus 600 seconds"
    </IfModule>

    Comments can again be deleted later.
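    The max-age and Expires values in these snippets are raw seconds, and it is easy to mislabel them in the comments: for instance, 43200 seconds is actually 12 hours, not a full day (a day is 86400). A quick sanity check, sketched here in Python just for illustration:

    ```python
    def seconds_to_human(seconds):
        """Render a raw seconds value as days/hours/minutes to sanity-check cache lifetimes."""
        days, rem = divmod(seconds, 86400)
        hours, rem = divmod(rem, 3600)
        minutes, _ = divmod(rem, 60)
        return f"{days}d {hours}h {minutes}m"

    # The lifetimes used in the .htaccess snippets above
    for label, value in [("html/htm", 43200), ("css/js/text", 604800),
                         ("images/flash", 2592000), ("xml", 600)]:
        print(f"{label}: {value} s = {seconds_to_human(value)}")
    ```

    Running this shows that the "one day" html value is really 12 hours, while the css/js and image values do come out to exactly 7 and 30 days; worth keeping in mind given that PageSpeed Insights wants at least a full day for statics.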

    If that does not work but you are hoping for a miracle, here are a few more variations of the same code; try them one at a time, not all at once:

  • ExpiresActive On
    ExpiresByType application/javascript "access plus 1 year"
    ExpiresByType text/javascript "access plus 1 year"
    ExpiresByType text/css "access plus 1 year"
    ExpiresByType image/gif "access plus 1 year"
    ExpiresByType image/jpeg "access plus 1 year"
    ExpiresByType image/png "access plus 1 year"
  • Header set Cache-control: private
    Header set Cache-control: public
  • BrowserMatch "MSIE" force-no-vary
    BrowserMatch "Mozilla/4.(2)" force-no-vary
  • FileETag MTime Size
    ExpiresActive on
    ExpiresDefault "access plus 1 month"
    Now that you have inserted the code that speeds things up by optimizing browser-side caching into .htaccess, and saved the changes, check your resource's page again in PageSpeed Insights and make sure the complaint has gone:

    As you can see, in my case “Use the browser cache” is no longer a critical flaw slowing down loading, and the icon next to the note has turned orange, though not green: unfortunately, I cannot influence the third-party services from which my site loads static content (such as Yandex, Google, FeedBurner and Uptolike).

    Q.E.D. Just like that, we dealt with one of the most significant problems that PageSpeed found.
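    Besides re-running PageSpeed Insights, you can verify directly what the server now sends: fetch any static file and look at its Cache-Control header (for example, in your browser's network panel). Here is a small Python sketch of the parsing involved; the header strings are just illustrative examples:

    ```python
    def parse_max_age(cache_control):
        """Extract the max-age value (in seconds) from a Cache-Control header, or None."""
        for directive in cache_control.split(","):
            directive = directive.strip()
            if directive.startswith("max-age="):
                try:
                    return int(directive.split("=", 1)[1])
                except ValueError:
                    return None
        return None

    print(parse_max_age("public, max-age=604800"))  # 604800
    print(parse_max_age("no-cache"))                # None
    ```

    If the returned max-age for your css/js files is at least 86400 (one day), PageSpeed Insights should stop flagging them.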

    How to enable compression of static objects on the server

    Another very common problem, which the GTmetrix service also complains about, is the lack of compression of files on the server before they are sent to users' browsers.

    Gzip compression is used for this, which I have already written about. If you analyze not directly through PageSpeed Insights but through GTmetrix, then in the PageSpeed area “Enable compression” is called “Enable gzip compression”, and in YSlow it is called “Compress components with gzip”.

    To enable this same Gzip compression on hosting that uses the Apache server, it is enough to add the corresponding piece of code to the .htaccess file (the remote server control file mentioned above). Apache has two modules for compression, and at least one of them is usually installed by your hoster (although that is not guaranteed).

    The more common one is mod_deflate, so let's start with it. We again add a check for the module's presence so as not to bring down the entire site with a 500 error.

    <IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/xml
    AddOutputFilterByType DEFLATE application/xml application/xhtml+xml
    AddOutputFilterByType DEFLATE text/css text/javascript application/javascript application/x-javascript
    </IfModule>

    The slightly less popular module is mod_gzip; the code enabling Gzip compression for the required file types looks like this:

    <IfModule mod_gzip.c>
    mod_gzip_on Yes
    mod_gzip_dechunk Yes
    mod_gzip_item_include file \.(html?|txt|css|js|php|pl)$
    mod_gzip_item_include mime ^text/.*
    mod_gzip_item_include mime ^application/x-javascript.*
    mod_gzip_item_exclude mime ^image/.*
    mod_gzip_item_exclude rspheader ^Content-Encoding:.*gzip.*
    </IfModule>

    Try it, then check the page in PageSpeed Insights after adding the code. If the problem goes away, consider yourself lucky. Because of my Apache-plus-nginx setup, none of this helped me (the hoster said nginx is responsible for statics, so in that situation it is nginx that must be configured; I do not know how they did it).
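    To see why gzip matters so much, here is a small illustration using Python's standard gzip module on a repetitive HTML-like sample (the markup is made up for the demo; real pages, being mostly repetitive text, typically shrink to a fraction of their size):

    ```python
    import gzip

    # A deliberately repetitive HTML-like payload, as web pages tend to be
    html = ("<html><head><title>demo</title></head><body>"
            + "<p>Lorem ipsum dolor sit amet, consectetur adipiscing elit.</p>" * 200
            + "</body></html>").encode("utf-8")

    compressed = gzip.compress(html, compresslevel=6)
    ratio = len(compressed) / len(html)
    print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
          f"({ratio:.1%} of original size)")
    ```

    This is exactly the transformation mod_deflate/mod_gzip (or nginx's gzip directive) applies on the fly before sending the response, at the cost of a little CPU on the server and in the browser.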

    Good luck to you! See you soon on the pages of the blog site


    PageSpeed Insights (PSI) reports on the performance of a page on both mobile and desktop devices, and provides suggestions on how that page may be improved.

    PSI provides both lab and field data about a page. Lab data is useful for debugging performance issues, as it is collected in a controlled environment. However, it may not capture real-world bottlenecks. Field data is useful for capturing true, real-world user experience, but has a more limited set of metrics.

    Performance score

    At the top of the report, PSI provides a score that summarizes the page's performance. This score is determined by running Lighthouse to collect and analyze lab data about the page. A score of 90 or above is considered fast, 50 to 90 is considered average, and below 50 is considered slow.

    Real-World Field Data

    When PSI is given a URL, it looks it up in the Chrome User Experience Report (CrUX) dataset. If available, PSI reports the First Contentful Paint (FCP) and First Input Delay (FID) metric data for the origin and potentially the specific page URL.

    Classifying Fast, Average, Slow

    PSI also classifies field data into 3 buckets, describing experiences considered fast, average, or slow. PSI sets the following thresholds for fast / average / slow, based on our analysis of the CrUX dataset:

          Fast          Average            Slow
    FCP   [0, 1000ms]   (1000ms, 2500ms]   over 2500ms
    FID   [0, 50ms]     (50ms, 250ms]      over 250ms

    Generally speaking, fast pages are roughly in the top ~10%, average pages are in the next 40%, and slow pages are in the bottom 50%. The numbers have been rounded for readability. These thresholds apply to both mobile and desktop and have been set based on human perceptual abilities.
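    The thresholds in the table above are easy to encode. A minimal sketch (metric values in milliseconds; the function name and structure are my own, not PSI's API):

    ```python
    # CrUX-style thresholds from the table above, in milliseconds:
    # (upper bound of the "fast" bucket, upper bound of the "average" bucket)
    THRESHOLDS = {"FCP": (1000, 2500), "FID": (50, 250)}

    def classify(metric, value_ms):
        """Bucket a field metric value as 'fast', 'average', or 'slow'."""
        fast_limit, average_limit = THRESHOLDS[metric]
        if value_ms <= fast_limit:
            return "fast"
        if value_ms <= average_limit:
            return "average"
        return "slow"

    print(classify("FCP", 900))   # fast
    print(classify("FCP", 1800))  # average
    print(classify("FID", 300))   # slow
    ```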

    Distribution and selected value of FCP and FID

    PSI presents a distribution of these metrics so that developers can understand the range of FCP and FID values for that page or origin. This distribution is also split into three categories: Fast, Average, and Slow, denoted with green, orange, and red bars. For example, seeing 14% within FCP's orange bar indicates that 14% of all observed FCP values fall between 1,000ms and 2,500ms. This data represents an aggregate view of all page loads over the previous 30 days.

    Above the distribution bars, PSI reports the 90th percentile First Contentful Paint and the 95th percentile First Input Delay, presented in seconds and milliseconds respectively. These percentiles are selected so that developers can understand the most frustrating user experiences on their site. These field metric values are classified as fast/average/slow by applying the same thresholds shown above.

    Field data summary label

    An overall label is calculated from the field metric values:

    • Fast: If both FCP and FID are Fast.
    • Slow: If either FCP or FID is Slow.
    • Average: All other cases.
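    That rule simply combines the two per-metric buckets; sketched in Python (again, an illustrative helper, not PSI's own code):

    ```python
    def overall_label(fcp_label, fid_label):
        """PSI-style summary: fast if both fast, slow if either slow, else average."""
        labels = {fcp_label, fid_label}
        if labels == {"fast"}:
            return "fast"
        if "slow" in labels:
            return "slow"
        return "average"

    print(overall_label("fast", "fast"))     # fast
    print(overall_label("fast", "average"))  # average
    print(overall_label("average", "slow"))  # slow
    ```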
    Differences between Field Data in PSI and CrUX

    The difference between the field data in PSI and the Chrome User Experience Report on BigQuery is that PSI's data is updated daily and covers the trailing 30-day period, while the dataset on BigQuery is updated only monthly.

    Lab data

    PSI uses Lighthouse to analyze the given URL, generating a performance score that estimates the page's performance on different metrics, including First Contentful Paint, Speed Index, Time to Interactive, First Meaningful Paint, First CPU Idle, and Estimated Input Latency.

    Why do the field data and lab data contradict each other? The Field data says the URL is slow, but the Lab data says the URL is fast!

    The field data is a historical report about how a particular URL has performed, representing anonymized performance data from real-world users on a variety of devices and network conditions. The lab data is based on a simulated load of the page on a single device with a fixed set of network conditions. As a result, the values may differ.

    Why is the 90th percentile chosen for FCP and the 95th percentile for FID?

    Our goal is to make sure that pages work well for the majority of users. Focusing on the 90th and 95th percentile values for our metrics ensures that pages meet a minimum standard of performance under the most difficult device and network conditions.

    Why does the FCP in v4 and v5 have different values?

    v5's FCP looks at the 90th percentile, while v4's FCP reports the median (50th percentile).

    What is a good score for the lab data?

    Any green score (90+) is considered good.

    Why does the performance score change from run to run? I didn't change anything on my page!

    Variability in performance measurement is introduced via a number of channels with different levels of impact. Several common sources of metric variability are local network availability, client hardware availability, and client resource contention.

    More questions?

    If you"ve got a question about using PageSpeed ​​Insights that is specific and answerable, ask your question in English on Stack Overflow.

    If you have general feedback or questions about PageSpeed Insights, or you want to start a general discussion, start a thread in the mailing list.


    Page loading speed is now a very powerful signal for search engines, and for users it is a significant factor that is hard to ignore when problems arise. By improving site speed you can not only gain ranking benefits but also earn more trust and higher conversion rates. Below is a list of the most useful tools for analyzing and identifying the weakest points of your site in terms of speed.

    1. Google PageSpeed Insights

    Google's page-loading-speed tool. It shows a value from 0 to 100 for both desktop and mobile devices, immediately points out the site's weak spots, and gives recommendations for optimizing speed.

    2. Pingdom Tools

    Gives a speed assessment and shows the number of requests to the server and the average loading time. The summary table details the data for each request to the server (styles, scripts, images, etc.), making it easy to see what exactly is slowing the site down.

    3. WhichLoadFaster

    Loads two sites side by side for comparison (yours and a competitor's) so you can visually observe which loads faster (convenient for demonstrating to clients). When loading finishes, it reports which site won and how many times faster it loaded.

    4. Web Page Performance Test

    Loads the page twice, compares the number of hits - reveals how well caching is organized, shows detailed statistics for each test. Saves screenshots of how the site looks every second of loading. It also shows in a convenient form which group of requests took the most time. The server is located in Dallas (USA).

    5. GTmetrix

    Another useful tool for testing site speed. It displays a lot of summary information and also keeps a history, so you can compare how your loading speed has improved or worsened over time. It pulls in Yahoo and Google recommendations for speed optimization, sorted by priority. The test server is located in Vancouver (Canada).

    6. Load Impact

    The service tests how much load the site can withstand (a light DDoS): several dozen users and more than a hundred active connections are emulated. Since the test lasts several minutes, you can use other tools during this time to evaluate page loading speed at rush hour. At the end of the test you get a graph of how loading speed changes with the number of active users.

    7. Monitis Tools

    Analyzes website loading from different parts of the Earth - servers in the USA, Europe and Asia. Displays summary statistics for each test.

    8. SiteSpeed.me

    Sends requests to the analyzed page from different data centers (about 30 servers) and determines the speed for each of them. Highlights the best, worst and average performance in time and speed.

    9. PR-CY

    Mass website speed check. You can specify up to 10 addresses - thus comparing the loading time and document size for each resource.

    10. WebPage Analyzer

    Report on page loading and all additional scripts/styles/images. A simple and often necessary tool.

    If you use any other free online tools to check website page loading speed, please share them in the comments.

    Many of you have probably used the wonderful PageSpeed Insights service from Google. Want to get the coveted 100 out of 100?

    Picture to attract attention

    It comes down to one small thing.

    So, here are the results of my test. Take any website; for example, I took a free ready-made responsive site template, transferred it to my hosting, and started testing. Result of the first test (link to the site):
    • speed for mobile: 79/100;
    • speed for desktop: 93/100.
    Not bad, huh?

    It complains about the following.

    Be sure to correct:
    Remove render-blocking JavaScript and CSS from above-the-fold content.
    Number of blocking CSS resources on the page: 3. They slow down the rendering of content.
    None of the above-the-fold content can be displayed until the following resources have loaded. Consider deferring these resources, loading them asynchronously, or inlining their most important parts directly into the HTML code.
    We apply a small trick and move the styles from the external file into the page code.

    Was: a link to the external stylesheet file.

    Became:

    <style>
    article, aside, details, figcaption, figure,
    footer, header, hgroup, nav, section { display: block; }
    /* and other styles */
    </style>
    And, hurray, we have higher results (link to the site):

    • speed for mobile: 99/100;
    • speed for desktop: 99/100.
    Now it complains about only one thing. Correct if possible:
    Minify HTML.
    Compressing the HTML code (including inline JavaScript and CSS) reduces the amount of data transferred, speeding up download and parsing. But that can be solved by compressing the code, and it is not related to this topic.
    And do not forget that we still have not solved the problem described above: all above-the-fold content is displayed only after the listed resources have loaded. The styles weigh just as much inline as they did in the file!
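    As for the "minify HTML" complaint, the idea is simply to strip the whitespace that readable markup carries. A deliberately naive sketch in Python (real minifiers such as html-minifier handle `<pre>`, conditional comments, and inline scripts properly; this only illustrates the principle):

    ```python
    import re

    def naive_minify(html):
        """Very naive HTML minification: collapse whitespace between and inside tags.
        (Not safe for <pre>, <textarea>, or inline scripts; illustration only.)"""
        html = re.sub(r">\s+<", "><", html)  # drop whitespace between tags
        html = re.sub(r"\s{2,}", " ", html)  # collapse runs of spaces/newlines
        return html.strip()

    src = """
    <html>
      <body>
        <p>  Hello,   world!  </p>
      </body>
    </html>
    """
    print(naive_minify(src))
    ```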

    And now the most important question: is this a bug or a feature?
    Thank you!
