Assessing page performance
By Natalie Forman • Sep 3, 2017
Because I was unfamiliar with site-speed terminology and methods before this project, I started simple: googling ways to increase site performance. While there were many helpful articles, it was hard to determine which suggestions applied to Industry Dive. So instead of hunting for solutions right away, I read Google’s PageSpeed Insights Rules and took a Udacity course called Website Performance Optimization for a more education-first approach. These resources familiarized me with the keywords I would continually see in page-speed articles. They also taught me how to evaluate our publication site’s current speed and determine whether files were being loaded optimally.
Current Speed Status
Once comfortable with the lingo, I needed to find our site’s weaknesses. I used two different tools, Google’s PageSpeed Insights and WebPageTest. Both provide diagnostic information and score site performance against their own benchmarks. While one goal was to improve these scores, they were only one success metric, since a site can be fast and user-friendly without a perfect score.
Google’s analysis provided reasoning and recommendations for fixing the speed problems it diagnosed. When I ran Google’s tool on our home page, we scored poorly against its benchmarks: about 56/100 on mobile and 68/100 on desktop.
While the results were discouraging, the suggestions for improvement were understandable and easy enough to start addressing. For example, it recommended compressing and caching all of our images, a simple suggestion with large performance potential.
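To make the caching half of that suggestion concrete, here is a minimal sketch of building long-lived cache headers for static images. The helper name, the one-year max-age, and the file names are my own illustration, not Industry Dive’s actual configuration:

```python
import mimetypes

# Hypothetical helper: build response headers so a browser caches a
# static image for a long time. A one-year max-age is a common choice
# for assets that rarely change in place; shorten it for assets that do.
def image_cache_headers(path, max_age=31536000):
    content_type, _ = mimetypes.guess_type(path)
    return {
        "Content-Type": content_type or "application/octet-stream",
        "Cache-Control": f"public, max-age={max_age}",
    }

headers = image_cache_headers("hero.jpg")
print(headers["Cache-Control"])  # public, max-age=31536000
```

On a repeat visit, the browser can then serve the image straight from its cache instead of re-downloading it, which is where the performance win comes from.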
To better understand our site’s current performance, I used WebPageTest to audit the overall results and visualize the render waterfall. WebPageTest provided technical feedback, including time until render, time until interactive and the number of bytes in and out. It also recorded the page load, letting me watch a video of the order in which the page rendered. This helped me determine which files to potentially move or combine so they would no longer hold up the render process. I also used its second-load statistics, which show the page’s results on reload after the assets have been cached, so I could evaluate the experience for returning users. In general, WebPageTest exposed the large number of files being called and the inefficient render order of our scripts.
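WebPageTest also exposes these numbers as a JSON result, which is handy for pulling metrics into scripts. A sketch of extracting the first-view metrics from one result follows; the field names (`render`, `domInteractive`, `SpeedIndex`, `bytesIn`, `bytesOut`) follow WebPageTest’s JSON output as I understand it, and the sample result is invented:

```python
# Sketch: pull the first-view metrics out of a WebPageTest JSON result.
# Field names are an assumption based on WebPageTest's JSON format;
# the sample data below is invented for illustration.
def first_view_metrics(result):
    fv = result["data"]["runs"]["1"]["firstView"]
    return {
        "start_render_ms": fv["render"],
        "dom_interactive_ms": fv["domInteractive"],
        "speed_index": fv["SpeedIndex"],
        "bytes_in": fv["bytesIn"],
        "bytes_out": fv["bytesOut"],
    }

sample = {"data": {"runs": {"1": {"firstView": {
    "render": 1800, "domInteractive": 2500,
    "SpeedIndex": 3200, "bytesIn": 1400000, "bytesOut": 52000,
}}}}}
print(first_view_metrics(sample)["speed_index"])  # 3200
```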
After determining areas for growth, I needed to decide how to test a change to make sure it was actually improving performance. I made a point of testing each change both before and after implementing it. My initial thought was to compare the live site against development branches. This didn’t work: the local server was so slow that testing the same site against itself produced a 20-second spread, so determining statistical significance was not going to be possible with that method.
Instead, I ended up hosting the modified branches and our current master branch on a test server and running comparisons with WebPageTest. To evaluate each change and its “success”, I monitored three benchmarks: the overall speed index, the time until the page started to render and the time at which the DOM first became interactive. All of these could be measured with WebPageTest. Altogether, I ran nine tests each on the original and test sites and averaged the results. I then compared the averages to see if the difference was significant enough to implement the change.
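The averaging-and-comparison step can be sketched in a few lines. The decision rule below, counting a change as meaningful only when the mean improvement exceeds the combined run-to-run spread, is my own stand-in for a formal significance test, and the speed-index values are invented:

```python
from statistics import mean, stdev

# Sketch of the comparison step: nine runs per site, averaged, then a
# change counts as meaningful only if the mean improvement is larger
# than the combined noise across runs. This rule is a stand-in for a
# formal significance test, not the author's exact method.
def compare(baseline_runs, candidate_runs):
    delta = mean(baseline_runs) - mean(candidate_runs)  # positive = faster
    noise = stdev(baseline_runs) + stdev(candidate_runs)
    return {"mean_improvement": delta, "meaningful": delta > noise}

# Invented speed-index values (ms) from nine runs of each site.
master = [3400, 3250, 3600, 3300, 3450, 3500, 3350, 3420, 3380]
branch = [2900, 2850, 3050, 2950, 2880, 3000, 2920, 2980, 2860]
result = compare(master, branch)
print(round(result["mean_improvement"]))  # 473
```

With real data, a proper two-sample test (for example, Welch’s t-test) would be the more rigorous way to decide whether nine runs are enough to trust a difference.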
I was now ready to start making adjustments to improve our page performance. Based on the evaluation results, I broke the next steps into four different goals.
- Optimize image rendering
- Reduce blocking scripts
- Prioritize visible content
- Strategically serve HTML and CSS
I will address each of these improvements in upcoming blog posts, explaining our changes and their results.