Google AJAX Libraries API

May 27, 2008 10:47 pm | 10 Comments

Today Dion Almaer announced the Google AJAX Libraries API. This is a great resource for developers using any of the popular JavaScript frameworks, including Prototype, Script.aculo.us, jQuery, Dojo, and MooTools. Rather than downloading a library to your own server and hosting it there, you can request it directly from ajax.googleapis.com:

<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js">
</script>
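The service also provides a JavaScript loader: you include Google's jsapi script and call google.load with a library name and version. A minimal sketch of that documented pattern (the empty callback is just illustrative):

```html
<!-- Load the Google loader, then request jQuery 1.2.6 through it -->
<script src="http://www.google.com/jsapi"></script>
<script>
  google.load("jquery", "1.2.6");
  google.setOnLoadCallback(function() {
    // jQuery is available here
  });
</script>
```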

The greatest benefit in my opinion is that any developer can leverage Google’s CDN to deliver the bulk of their JavaScript. From YSlow and my book I received a lot of feedback that Rule 2: Use a CDN was out of reach for many developers. I use jQuery in one of my personal projects and serve it on my hosted web site from one geographic location. Ouch. Being able to move that 21K closer to my users is great.

There's also a community benefit: as more and more sites reference these shared URLs, a user who navigates across those sites may only have to download each file once and read it from cache on all the other sites. The versions served up by Google have an Expires date set one year in the future and are gzipped. Three of the libraries also have a minified version, which cuts the size another 13–21 percentage points by minifying the JavaScript before gzipping:

Library    Original   Gzipped     Minified & Gzipped
jQuery     100K       30K (30%)   17K (17%)
MooTools   183K       49K (27%)   19K (11%)
Dojo       294K       90K (31%)   29K (10%)

All of these performance optimizations are great to see. YUI, another great alternative, is already offered via Yahoo!’s CDN from yui.yahooapis.com with similar performance features. Already there are nearly 50 comments on the Ajaxian post. Performance is a hot topic now as we push browsers to their limits. Services like this are just what’s needed to help us make the user experience as fast as possible.

10 Responses to Google AJAX Libraries API

  1. Sounds good! I was thinking about community-hosting the libraries above, this is even better.

    I was thinking that adding this link to a page forces the browser to potentially do one DNS lookup more.

    It would be interesting to know how large the distribution of the above files is, e.g. when the threshold of cache vs. dns lookup is reached.

    And one more thing: how good is the google CDN outside the US? For example here in Germany many US-based CDNs only have one POP. With 99% of our users coming from Germany using a CDN may actually reduce the speed, since our servers sit right in the second-largest DSL providers main datacentre.

  2. @Jan: I’ve tested the Google CDN and the performance is astonishing! Here in France, the YUI-compressed MooTools takes about 100ms to load according to Firebug. Even compared with my own build of MooTools (without a lot of components), it’s pretty much the same load time!

    Google has done terrific work here. The one big problem, though, is that they only offer full-size versions of each framework, which means unused components get loaded too.

    I think it mainly targets heavy web apps, not the small uses of JS you see on blogs, for example.

  3. Can’t wait to use it.

  4. Does anyone know what minification process Google is using?

  5. Steve, if you can put in the good word to the API team to host older versions of these libraries that would be awesome!

  6. Dan: The minified versions were provided by the framework teams themselves.

    John: The choice of which versions to serve was determined by the framework teams.

  7. Steve: I was able to attend your session at the recent Google I/O conference – great preso. Any chance you will be posting the slides – wanted to share with a couple of my colleagues.

  8. Brian: Thanks for the feedback. That was a lot of deep material. I worked hard to use good examples and clear data. It’s nice to hear it was well received.

    The slides are available on my front page: http://stevesouders.com/ (click on the “slides” link next to “Google I/O”).

  9. Steve,

    I realize this has been around for a couple years, and I love the concept, but I’ve been stalling to implement it for a certain site because we have time-sensitive functionality (real-time bidding on auctions), and despite the fact Google’s servers are generally extremely fast, sometimes they’re slow, and that could have major consequences for us.

    I realize there are workarounds to load jQuery from our own server when Google is down or blocked, but is there anything we can do for times when Google is just plain slow? The reasoning is we can justify the occasional problem on our own servers, because if jQuery is not loading, we have bigger problems, and our customers are probably aware, and we’ll make sure we accommodate those users in our auctions, but if Google CDN starts running slowly for customers in a certain part of the country, we won’t know in time, and they won’t be happy.

    Are we being overly concerned? Are we simply a bad candidate for Google CDN, due to the nature of our site? Or is there a workaround so that we can avoid this becoming an issue?

    It seems like it would be ideal if we could just specify a callback that would run when the Google Loader API noticed poor response times.

    I haven’t found any good solutions for this, so I’m hoping you can help.

    Thanks,
    Richard

  10. @Richard: The first step is to gather some data. I see this often: people spend a lot of effort based on anecdotal data or perceived performance, and it’s usually wrong. What’s the average, median, and 95th-percentile load time for jQuery from Google vs. your own servers? Since you mostly discuss extreme outages (e.g., response time > 10 seconds), what percentage of the time is there an “outage” on Google vs. your servers?

    I’m willing to bet you’re going to find Google Ajax Libraries is better in every metric, but for the sake of discussion let’s assume there’s enough of an issue that you’d like to pursue this. Even if the outages are rare, it’d be nice to have a fallback because it’s less likely both Google and your servers would be down at the same time.

    I’ve never worked on this before, but you could use something like the image/object script-loading technique from ControlJS/Stoyan and load BOTH copies of jQuery simultaneously. Remember that with these techniques the script is downloaded but NOT parsed and executed. Then, whichever copy comes back first is the one you execute.
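    For completeness, the widely used simpler fallback looks like the sketch below. It is not from this post, and it only covers “Google is down or blocked”, not “Google is slow”, which is why the parallel-download idea above goes further. The local path /js/jquery.min.js is an assumption; substitute your own.

```javascript
// Sketch of a CDN fallback check. The /js/jquery.min.js path is hypothetical.
function jqueryLoaded(win) {
  // jQuery assigns window.jQuery when it executes successfully,
  // so its absence means the CDN script tag failed.
  return typeof win.jQuery !== "undefined";
}

// In the page, immediately after the ajax.googleapis.com script tag:
//
//   <script>
//     if (!jqueryLoaded(window)) {
//       document.write('<script src="/js/jquery.min.js"><\/script>');
//     }
//   </script>
```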