WebPerfDays: Performance Tools

October 9, 2012 5:06 pm | 25 Comments

I just returned from Velocity Europe in London. It was stellar. And totally exhausting! But this post is about the other fantastic web performance event that took place after Velocity: WebPerfDays London.

WebPerfDays is like a day-long web performance meetup event. Aaron Kulick organized the first one last June, right after Velocity US. He had the brilliant idea to grab webperfdays.org with the goal of starting a series of events modeled after DevOpsDays. The intent is for other people to organize their own WebPerfDays events. All the resources are meant to be shared – the website, domain, templates, Twitter handle, etc.

Stephen Thair continued the tradition by organizing last week’s WebPerfDays in London. It was held at Facebook’s new London office. They contributed their space on the top floor with beautiful views. (Sorry for the broken sofa.) WebPerfDays is an UNconference, so the agenda was determined by the attendees. I nominated a session on performance tools based on two questions:

What’s your favorite web performance tool?
What tools are missing?

Here are the responses gathered from the attendees:

Favorite Performance Tool:

Updates:

Missing Tools:

  • When analyzing a website, a tool is needed that calculates the average delta between each resource’s Last-Modified date and today and compares that to the expiration time. The goal is to indicate to the web developer whether the expiration window is commensurate with the resource change rate. This could be part of PageSpeed, YSlow, and HTTP Archive, for example. (A rough sketch follows this list.)
  • Automated tool that determines if a site is using a blocking snippet when an async snippet is available. For example, PageSpeed does this but only for Google Analytics.
  • Tools that diagnose the root cause for rendering being delayed.
  • Easier visibility into DNS TTLs, e.g., built into Chrome Dev Tools and WebPagetest.
  • Backend tool that crawls file directories and optimizes images. Candidate tools: Yeoman, Wesley.
  • Nav Timing in (mobile) Safari.
  • Better tools for detecting and diagnosing memory leaks.
  • Web timing specs for time spent on JavaScript, CSS, reflow, etc. (available in JavaScript).
  • Tools to visualize and modify Web Storage (localStorage, app cache, etc.).
  • Tools to visualize and clear DNS cache.
  • A version of JSLint focused on performance suggestions.
  • A tool that can diff two HAR files. (A sketch appears after the Updates note below.)
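
For the first item above, here is a rough sketch of the Last-Modified-vs-expiration check, assuming Python and only the standard library; the sample URL and the factor-of-two threshold are illustrative assumptions, not part of any existing tool:

```python
# Sketch of the "is the expiration window commensurate with the change
# rate?" check. Assumptions: Python stdlib only; the factor-of-two
# threshold and the sample URL are illustrative, not canonical.
import urllib.request
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def freshness_report(url):
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        headers = resp.headers
    now = datetime.now(timezone.utc)

    last_modified = headers.get("Last-Modified")
    if last_modified is None:
        return f"{url}: no Last-Modified header; can't estimate change rate"
    # Time since the last change is a crude proxy for the change rate.
    age = now - parsedate_to_datetime(last_modified)

    # Freshness lifetime: prefer Cache-Control max-age, fall back to Expires.
    lifetime = None
    for directive in headers.get("Cache-Control", "").split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            lifetime = float(directive[len("max-age="):])
    if lifetime is None and headers.get("Expires"):
        lifetime = (parsedate_to_datetime(headers["Expires"]) - now).total_seconds()
    if lifetime is None:
        return f"{url}: no freshness lifetime (no max-age or Expires)"

    # Heuristic: flag expiration windows far shorter than the observed age.
    if lifetime < age.total_seconds() / 2:
        return (f"{url}: unmodified for {age.days} days but cacheable for only "
                f"{int(lifetime)}s; consider a longer expiration window")
    return f"{url}: expiration window looks commensurate with change rate"

print(freshness_report("https://www.example.com/main.css"))
```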

Updates:

  • In-browser devtools that let you drill into each resource fetched or cached, listing the full set of reasons why that resource was or wasn’t loaded from the cache (down to the combination of HTTP headers at play in the current and, as applicable, a prior request), as well as when it would get evicted from the cache and why: https://bugs.webkit.org/show_bug.cgi?id=83986
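
The HAR-diff idea from the list above is mostly JSON plumbing. Here is a minimal sketch that compares two HAR 1.2 files by request URL and reports added, removed, and resized responses; the field names follow the HAR 1.2 format, and everything else is illustrative:

```python
# Minimal HAR diff: compares two HAR 1.2 files by request URL.
# Assumptions: well-formed HAR input; duplicate URLs keep the last entry.
import json
import sys

def index_entries(path):
    with open(path) as f:
        har = json.load(f)
    return {e["request"]["url"]: e for e in har["log"]["entries"]}

def diff_har(path_a, path_b):
    a, b = index_entries(path_a), index_entries(path_b)
    for url in sorted(b.keys() - a.keys()):
        print(f"+ added   {url}")
    for url in sorted(a.keys() - b.keys()):
        print(f"- removed {url}")
    for url in sorted(a.keys() & b.keys()):
        size_a = a[url]["response"]["bodySize"]
        size_b = b[url]["response"]["bodySize"]
        if size_a != size_b:
            print(f"~ resized {url}: {size_a} -> {size_b} bytes")

if __name__ == "__main__":
    diff_har(sys.argv[1], sys.argv[2])  # e.g. python hardiff.py before.har after.har
```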

This was stream of consciousness from the audience. It’s not an exhaustive list. Do you have a favorite web performance tool that’s not listed? Or a performance analysis need without a tool to help? If so, add a comment below. And consider organizing WebPerfDays in your area. Aaron, Stephen, and I would be happy to help.

25 Responses to WebPerfDays: Performance Tools

  1. Torbit Insight (torbit.com) to measure your site speed and quantify how much speed impacts your $$. It’s free and uses the Episodes library to give you accurate loading times for every visitor on every page.

  2. Nice idea about local meetings.

    About the missing tools: “Backend tool that optimizes images” is not missing ;) -> Grunt.js http://gruntjs.com/ (search the plugins for “images” to see different tools that can help; grunt-imagine especially looks perfect).

    “Better tools for detecting memory leaks” is something I would really like to see. The Chrome Dev Tools memory heap snapshots are a first step, but still too obscure.

  3. Another missing tool: something that analyzes elements on your page after one page load and tells you, depending on the time to download elements and their order, on which sharded domain you should place your content to get the best performance.
    Example: CSS1 should go on static1.example.com, CSS2 on static2.example.com, Image 1 on static1, etc etc.
    Of course it should also depend on how many parallel downloads you can perform.

  4. Hi Steve,
    I’ve just released a tool called sitespeed.io that analyzes multiple pages of
    a site, using YSlow as a backend but with new, modified rules. The new rules
    cover, for example, SPOF, CSS print files, and minimizing DNS lookups inside the head.

    You can check it out at http://sitespeed.io

    BR
    Peter

  5. I’d like a tool that spiders the whole site and then applies smushit-style checks to each page. It would then provide a report of all pages where image optimisations can be made, with the ability to sort results by page so you can prioritise the pages where the most savings are possible.
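
The per-image check in this suggestion can be approximated today by shelling out to lossless optimizers; the spider and per-page rollup are the missing piece. A rough sketch of the savings report, assuming the optipng and jpegtran command-line tools are installed and a local directory stands in for the crawled site:

```python
# Rough sketch: losslessly recompress each image into a temp file and
# report the potential byte savings. Assumes optipng and jpegtran are on
# PATH; the "static/" directory walk stands in for a real site spider.
import os
import subprocess
import tempfile

def potential_savings(path):
    fd, tmp = tempfile.mkstemp(suffix=os.path.splitext(path)[1])
    os.close(fd)
    try:
        if path.lower().endswith(".png"):
            subprocess.run(["optipng", "-quiet", "-o2", "-clobber",
                            "-out", tmp, path], check=True)
        elif path.lower().endswith((".jpg", ".jpeg")):
            subprocess.run(["jpegtran", "-copy", "none", "-optimize",
                            "-outfile", tmp, path], check=True)
        else:
            return 0
        return max(0, os.path.getsize(path) - os.path.getsize(tmp))
    finally:
        os.remove(tmp)

for root, _dirs, files in os.walk("static/"):  # illustrative directory
    for name in files:
        full = os.path.join(root, name)
        saved = potential_savings(full)
        if saved:
            print(f"{full}: {saved} bytes recoverable")
```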

  6. Great list – just created feature requests for WebPagetest for many of them.

  7. I’d add Qualys SSL Test (https://www.ssllabs.com/ssltest/) to the list.

    It’s very handy for looking at SSL cert chains and whether they can be reduced.

  8. I added these suggestions.

    @TOMHTML: What are the constraints that determine the assignments? This seems too fragile to me. Do you re-run it every time you change your pages? I prefer picking a sharded domain based on a hash of the filename.

    @Olly: I think you can do that with the tools mentioned, or at least something close (like crawl the file directories).

  9. Missing tool: in-browser devtools that let you drill into each resource fetched or cached, listing the full set of reasons why that resource was or wasn’t loaded from the cache (down to the combination of HTTP headers at play in the current and, as applicable, a prior request), as well as when it would get evicted from the cache and why:

    https://bugs.webkit.org/show_bug.cgi?id=83986

  10. I’d add the new free tool from Appneta called SpeedCheckr (see http://www.appneta.com/speedcheckr/faq/). It’s not a dev tool, but it’s a unique way to continuously monitor the capacity, utilisation, loss, and RTT of your Internet connection. Other free tools are at their site, http://www.appneta.com, but this one has the most relevance to Internet performance.

  11. Here are a few:
    siege: http://www.joedog.org/siege-home/
    tsung: http://tsung.erlang-projects.org/

  12. @Steve I’m OK with your idea, but you definitely can’t be sure that using a hash to get the sharded domain is always the best solution. Perhaps it’s the worst! Example: let’s say you use 2 sharded domains and perform “hash(file) modulo 2” to get the domain. You have 6 connections per domain. If you are unlucky, you could get your first 10 CSS/JS/images on shard #1 and maybe only 3 elements on shard #2.
    I might be wrong, but I suppose elements are always downloaded in the same order (with an unprimed cache).
    With WebPageTest, for example, the tool can easily know all resources downloaded, the list of sharded domains (or you configure it), and the time it took to download each element. Assuming the download time and DNS resolution time are the same for all sharded domains, and that you don’t have a really small bandwidth, WebPageTest (or another tool) might be able to tell which element should be placed on which sharded domain in order to avoid blocking and minimize total load time.
    One could imagine that alternating sharded domains would suffice (#1, #2, #1, #2, …), but this logic fails when elements don’t all have the same size. I’m not good at maths, but I think this problem might be solved with Dijkstra’s algorithm.
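
One simple heuristic for the assignment problem being discussed here (a sketch only, and not necessarily the algorithm the commenter has in mind) is greedy load balancing: sort resources largest-first and always place the next one on the shard with the fewest bytes assigned so far. The resource sizes and shard names below are made up:

```python
# Greedy "least-loaded shard" sketch for balancing downloads across
# sharded domains. Resource sizes and shard names are illustrative.
resources = {"app.js": 120_000, "hero.jpg": 300_000, "main.css": 40_000,
             "sprite.png": 80_000, "fonts.css": 15_000}
shards = {"static1.example.com": 0, "static2.example.com": 0}

assignment = {}
for name, size in sorted(resources.items(), key=lambda kv: -kv[1]):
    shard = min(shards, key=shards.get)  # pick the least-loaded shard
    assignment[name] = shard
    shards[shard] += size

for name, shard in sorted(assignment.items()):
    print(f"{name} -> {shard}")
```

Note that this balances bytes per shard for one snapshot of the page; as comment 14 points out, the assignment also needs to stay stable as the page changes.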

  13. Awesome list, Steve, we should use it at Meet4SPEEDs.

  14. @TOMHTML: It’s important to always use the same domain for a specific resource – otherwise it won’t be read from cache. If you alternate domain assignments, as you suggest, and the page changes, then resources that are already in the user’s cache under domain A would have to be re-downloaded if they’re now assigned to domain B. The best solution is one that always produces the same domain assignment for any given resource.
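
A minimal sketch of that stable, hash-based assignment (the shard domains are illustrative; hashlib is used instead of Python’s built-in hash, which is randomized per process, so the mapping stays the same across runs and machines):

```python
# Deterministic shard assignment from a hash of the filename: the same
# resource always maps to the same domain, so cached copies stay valid.
# The shard domains are illustrative.
import hashlib

SHARDS = ["static1.example.com", "static2.example.com"]

def shard_for(filename):
    digest = hashlib.md5(filename.encode("utf-8")).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

print(shard_for("/css/main.css"))  # same input -> same shard, every run
```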

  15. One of my favorite tools (I may be a little biased) is WebSiteTest (http://websitetest.com). It provides the basics that you see in WebPageTest, plus a lot of extra functionality. The big difference is the multi-variant testing, which allows me to set up a single test configuration that grabs samples from multiple browsers, locations, or connectivity types. And I can run more than the 10 I am limited to on WebPageTest, and unless you turn it off, we always capture the screenshots.

  16. Is there any tool to check whether a JS file is being used?

    I use this tool to check whether a CSS file is being used.

    http://unused-css.com/

  17. @SteveSouders I agree with you: we should always choose the same sharded domain for a given resource. Moreover, I’m not saying “recalculate everything each time you add a CSS line” (a few bytes shouldn’t change anything); I’m talking about creating a new website and putting the right resources in the right place before it goes public.
    I imagine a WebPageTest check that would analyze the sharded domains for a website and alert me when I’m fetching 10 resources consecutively from the same sharded domain (just an example here), which would block downloading due to per-domain connection limits. Then I could change the resource-to-domain assignments before the site goes public.

  18. @cristobal This is what you are looking for, right? https://code.google.com/p/script-cover/

  19. AT&T’s Application Resource Optimizer (ARO):
    http://developer.att.com/ARO
    https://github.com/attdevsupport/ARO

    Free and open source tool to grab packet traces off of your mobile device.

  20. RE: Missing Tools:
    • When analyzing a website, a tool is needed that calculates the average delta between each resource’s Last-Modified date and today and compares that to the expiration time. The goal is to indicate to the web developer whether the expiration window is commensurate with the resource change rate. This could be part of PageSpeed, YSlow, and HTTP Archive, for example.

    Try ShowSlow (showslow.org) and connect it with YSlow, Dynatrace Ajax, WebPageSpeed, and I believe you can add more beacons to the list, too.

  21. @TOMHTML THANKS!!

  22. Another one I just learned about: Windows Performance Toolkit. It seems to be a generic performance monitoring framework for Windows, yet it has some web-specific features. See this blog post from 2010: http://blogs.msdn.com/b/ie/archive/2010/06/21/measuring-browser-performance-with-the-windows-performance-tools.aspx

    I learned about this while watching Jatinder Mann’s talk on performance at Build 2012: http://jatindersmann.com/2012/10/31/build-2012-50-performance-tricks-to-make-your-html5-apps-and-sites-faster/

  23. You mentioned PhantomJS; I suggest you also take a look at CasperJS, which offers a nice API for testing and scripting a headless WebKit browser:

    http://casperjs.org/

  24. Check out Revealed, which is great for marketers and business users to understand the impact of performance on their KPIs, and allows for competitive performance benchmarking:

    http://www.rvealed.com/try

  25. Apache JMeter is missing. Also, a JMeter testing cloud should be added to the list, I think.