Google PageSpeed Insights Needs More Polish

Disclaimer: No one crunches numbers like Google, probably not even the NSA or U.S. military.

But that doesn’t mean its number crunching is perfect.

As part of its business strategy of becoming the de facto maker and enforcer of standards on the Internet, Google released PageSpeed Insights four years ago. The concept is simple: paste your web address into its little scanner, and it quickly analyzes the site, rates both the desktop and mobile versions on a scale of 0 to 100, and spits out a few suggestions for improvement.
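
If you’d rather script the check than use the web form, the same analysis is exposed over HTTP. Here is a minimal Python sketch against the current v5 API (the endpoint path and response fields are assumptions drawn from Google’s public documentation; the version available when this was written returned a different shape):

    import json
    import urllib.parse
    import urllib.request

    # Query the PageSpeed Insights API (v5 endpoint; the path and the
    # response shape below are assumptions based on Google's public docs).
    API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

    def pagespeed_score(url, strategy="mobile"):
        """Return the 0-100 performance score for a URL."""
        query = urllib.parse.urlencode({"url": url, "strategy": strategy})
        with urllib.request.urlopen(API + "?" + query) as resp:
            data = json.load(resp)
        # The API reports the score as a 0-1 float; scale it to the
        # familiar 0-100 range the web form shows.
        return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

    if __name__ == "__main__":
        print(pagespeed_score("https://example.com"))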

Problem is, many people confuse that rating with the actual speed of the site. Google itself admits the tool is not a speed test. It is simply a measure of how your site compares with what Google’s performance algorithms consider an ideally optimized site.

The trouble, at least from where I sit, is that the algorithms really have not evolved much since PageSpeed Insights was released. That became especially clear on a site I optimized last week. After tweaking hundreds of images down to the bare minimum of acceptable quality, there were still eight images Google said “needed more work.”

Intrigued, I checked the details and learned that Google thought I could shave a grand total of 19KB off all eight images combined. For example, squeezing 1.1KB out of a social icon would reduce that image’s size by a whopping 49%!

I stared at the report and sighed. Would making any of those changes materially improve download speeds for my client and his customers? Not even close. It is the web equivalent of spitting in the ocean.
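
To put the numbers in perspective, here is the back-of-the-envelope math (the 2.2KB icon size is reverse-engineered from Google’s 49% figure; the 2MB total page weight is a hypothetical stand-in for a typical page):

    # Rough proportions behind Google's image recommendations.
    icon_size_kb = 2.2        # a social icon (back-calculated from the 49% claim)
    icon_savings_kb = 1.1     # what Google said could be squeezed out of it
    total_savings_kb = 19.0   # combined savings across all eight images
    page_weight_kb = 2000.0   # hypothetical total page weight (assumption)

    # ~50%: sounds impressive in isolation.
    print("Icon reduction: {:.0%}".format(icon_savings_kb / icon_size_kb))
    # ~0.95%: the spit in the ocean.
    print("Page reduction: {:.2%}".format(total_savings_kb / page_weight_kb))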

So why do I care enough to call Google out on this? Because I have seen other reports where clients were using huge photos on the front page … 15,000KB images that really should be 100KB. A gaffe like that slows the site dramatically for the owner and his customers. Yet Google appears to weight these obvious errors the same as the 1.1KB icon. To Google, it is either black or white. There is no gray.
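
That kind of mistake, unlike the 1.1KB icon, is worth flagging loudly, and it is trivially fixable. A minimal sketch using the Pillow imaging library (the filenames and target dimensions here are placeholders, not from any client site):

    from PIL import Image  # third-party: pip install Pillow

    # Downscale and recompress an oversized photo before it hits the page.
    img = Image.open("hero-original.jpg")
    img.thumbnail((1600, 1600))  # cap the longest edge; aspect ratio preserved
    img.save("hero-web.jpg", quality=80, optimize=True, progressive=True)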

Am I naive enough to think Google couldn’t adjust its algorithm to be genuinely helpful to site owners? No. It could, and that’s what irks me.

When it released the 100-point scale, Google, with all the science of game theory behind it, knew that people would obsess over hitting 100 percent. What it didn’t obsess over was tweaking the algorithm so that it would prioritize problems and actually help developers build better sites.

The other funny/tragic part of what Google has done is to build little tripwires into the rating system. One of the most difficult recommendations to resolve is “leverage browser caching.” It’s easy to fix caching settings on content we host, but when content is served from another company’s server, we lose the ability to specify caching.
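
For the content we do host, the whole fix is a single response header. A toy sketch using only Python’s standard library (in production this would live in Apache or nginx configuration, but the idea is identical):

    from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

    class CachingHandler(SimpleHTTPRequestHandler):
        """Serve local files with an explicit one-year cache lifetime."""

        def end_headers(self):
            # Declaring how long browsers may keep a file is all that
            # "leverage browser caching" asks for. We can do it here,
            # but not for files served from someone else's domain.
            self.send_header("Cache-Control", "public, max-age=31536000")
            super().end_headers()

    if __name__ == "__main__":
        ThreadingHTTPServer(("", 8000), CachingHandler).serve_forever()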

Google knows that. But want to guess whose third-party content most frequently flunks the test? Google’s. That’s right, kids: Google doesn’t specify browser caching times on Google Analytics, Google Maps, Google Web Fonts, and the rest.

So Google has chosen to dangle the magical 100% carrot without addressing the obvious fixes on its own end that would make the target more achievable.

Maybe it’s not polite to call the behemoth out. I don’t much care. The company that went public on the coattails of its “Don’t Be Evil” pledge could do better. On a scale of 0 to 100, I’d give PageSpeed Insights a rating of about 65, with “more work needed.”

My advice? If you care about your users and the actual speed of your site, analyze it with a tool built for that purpose, like Pingdom’s speed test. Go ahead and run PageSpeed Insights, but take its findings with a grain of salt. Or maybe an aspirin. It’s more marketing hokum than meaningful help.
