Browserscope: how does your browser compare?

September 13, 2009 7:06 pm | 1 Comment

A year ago this week, I launched UA Profiler – a framework for measuring browser performance. Today, Browserscope takes browser profiling to the next level. Browserscope is a full-blown open source project, headed up by Lindsey Simon (see his blog post). The framework has been expanded to cover multiple test categories:

These initial tests highlight Browserscope's strength as a framework for hosting multiple test categories, whether new or existing. Other test categories being planned include CSS reflow, cookies, and security.

Test results are crowdsourced – anyone can point their browser at Browserscope and run one or all of the tests. Crowdsourcing produces greater browser coverage tested under real-world conditions, and lowers Browserscope’s resource requirements, meaning the project can run in perpetuity.

We need you to contribute. Suggest tests you’d like to see added to Browserscope, review the code, submit a bug, or contact the group. And take a few minutes right now to find out how your browser ranks, and share your results to help build up this repository of browser behavior.

How Does Your Browser Compare?

One Response to Browserscope: how does your browser compare?

  1. The “Max Connection per Hostname” test sometimes fails to give the true result. I suppose the methodology is to include multiple inline images in the page, each of which takes some time to download. The client-side JavaScript just counts how many images have loaded after a given timeout.

    If the network has some latency, this methodology will give a result smaller than the true value. But I cannot figure out a better, more accurate way to measure it.
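
For readers curious about the mechanics, here is a rough sketch of the kind of measurement the commenter describes – not Browserscope’s actual test code. The hostname, the sleep query parameter, and the image count are all made up for illustration; the idea is simply that a server holds each image response open for a fixed delay, and the page counts how many images finish within one delay window.

    // Sketch of a max-connections-per-hostname probe (illustrative only).
    // Assumes a hypothetical endpoint that delays each image response by
    // SLEEP_SECONDS before returning.
    var TOTAL_IMAGES = 20;   // more than any browser's per-host limit
    var SLEEP_SECONDS = 2;   // server-side delay per image
    var loadedInFirstWindow = 0;

    for (var i = 0; i < TOTAL_IMAGES; i++) {
      var img = new Image();
      img.onload = function() {
        loadedInFirstWindow++;
      };
      // Unique query string defeats caching; all requests hit one hostname.
      img.src = 'http://test.example.com/img?sleep=' + SLEEP_SECONDS + '&i=' + i;
    }

    // After one delay window (plus a little slack), the number of images that
    // have finished approximates the per-hostname connection limit. Network
    // latency eats into that window, which is why the result can come out low,
    // as the commenter notes.
    setTimeout(function() {
      alert('Estimated max connections per hostname: ' + loadedInFirstWindow);
    }, (SLEEP_SECONDS + 0.5) * 1000);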