First of all, the title is perhaps misleading. Basically: don't put plain script tags that are not async in the head tag.
If you put a piece of Javascript in the head of an HTML page, the browser will start downloading it and proceed down the lines of HTML, downloading other resources as it encounters them, such as the CSS files. Then, once all the Javascript and CSS has been downloaded, it will start rendering the page, and when it does that it will download any images referenced in the HTML. At roughly the same time it will start to display things on the screen. But it won't do any of this until the CSS and Javascript have been downloaded.
To repeat: the browser screen will stay blank, and it won't start downloading any images, if downloading a Javascript URL referenced in the head gets stuck.
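To make that concrete, here's a minimal sketch (the file and host names are made up for illustration) of the kind of head that holds the whole first paint hostage:
<!DOCTYPE html>
<html>
<head>
  <link rel="stylesheet" href="/static/style.css">
  <!-- if this host is slow or unreachable, nothing below gets painted
       until the request finishes or times out -->
  <script src="http://cdn.example.com/analytics.js"></script>
</head>
<body>
  <h1>You won't see this heading until analytics.js has arrived</h1>
  <img src="/static/photo.jpg">
</body>
</html>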
Here are two perfectly good examples from this morning's routine hunt for news:
Here's what getharvest.com does in their HTML:
<!DOCTYPE html>
<html lang="en">
<head>
<script type="text/javascript">var NREUMQ=NREUMQ||[];NREUMQ.push(["mark","firstbyte",new Date().getTime()]);</script>
<script type="text/javascript" src="http://c761485.r85.cf2.rackcdn.com/gascript.js"></script>
...
Why it gets stuck connecting to c761485.r85.cf2.rackcdn.com I just don't know. But it does. The Internet is like that oftentimes: you simply can't connect to otherwise perfectly configured web servers.
Update-whilst-writing-this-text! As I was writing this text I gave getharvest.com a second chance, thinking that most likely the squirrels in my internet tubes would be back up and running to rackcdn.com, but then this happened! (screenshot: /static/cache/bd/02/bd02367be6bbe6d16444051619d88bee.jpg)
So, what's the right thing to do? Simple: don't rely on external resources. For example, you can move the Javascript script tag to the very bottom of the HTML page. That way the browser will render as much as it possibly can whilst waiting for the Javascript resource to get unstuck. Or, almost equivalently, you can keep the script tag in the <head> but put an async attribute on it like this:
<script async type="text/javascript" src="http://c761485.r85.cf2.rackcdn.com/gascript.js"></script>
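(The async attribute tells the browser to download the script in parallel and execute it whenever it arrives, without holding up rendering.) For comparison, the bottom-of-the-page version looks roughly like this - a sketch, not getharvest.com's actual markup:
<!DOCTYPE html>
<html>
<head>
  <link rel="stylesheet" href="/static/style.css">
</head>
<body>
  <h1>This renders right away</h1>
  <img src="/static/photo.jpg">
  <!-- by the time this gets stuck, the page is already painted -->
  <script type="text/javascript" src="http://c761485.r85.cf2.rackcdn.com/gascript.js"></script>
</body>
</html>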
Another thing you can do is not use an external resource URL (aka a third-party domain). Instead of using cdn.superfast.com/file.js you use /file.js. Sure, that fancy CDN might be faster at serving up stuff than your server, but looking up the CDN's domain costs one more DNS lookup, which we know can be very expensive for that first-time impression.
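In markup terms the difference is just the src attribute (cdn.superfast.com being the made-up domain from above):
<!-- costs an extra DNS lookup, and a new TCP connection, before the
     download can even begin -->
<script src="http://cdn.superfast.com/file.js"></script>

<!-- reuses the DNS lookup and connection the browser already made to
     fetch the page itself -->
<script src="/file.js"></script>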
I know I'm probably guilty of this on some of my (now) older projects. For example, if you open aroundtheworldgame.com it won't render anything until it has managed to connect to maps.googleapis.com and dn4avfivo8r6q.cloudfront.net, but that's more of an app than a web site.
By the way... I wrote some basic code to play around with how this actually works. I decided to put this up in case you want to experiment with it too: https://github.com/peterbe/slowpage
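The gist of it is a page that references a stylesheet, a script and an image, where the server artificially delays the script. The test page is essentially this (a sketch, not the exact contents of the repo):
<!DOCTYPE html>
<html>
<head>
  <link rel="stylesheet" href="/static/style.css">
  <script src="/static/script.js"></script>
</head>
<body>
  <img src="/static/photo.jpg">
</body>
</html>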
Comments
This actually applies to css too, except you can't make that async.
That's why I'm moving away from putting .css on CDNs. My Nginx is fast enough. Also, I've started using inline CSS instead to make that experience even faster.
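By inline CSS I mean shipping the critical rules in the HTML document itself rather than via a <link> to a CDN, something like:
<head>
  <style>
    /* no extra request and no extra DNS lookup; the styles arrive with the HTML */
    body { margin: 0; font: 16px/1.4 sans-serif; }
  </style>
</head>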
Browsers will absolutely start downloading images while waiting on a script to load, last I checked.
But yes, any script not needed to actually render the page should go at the bottom.
Not sure actually. Try my slowpage code and you'll notice that photo.jpg is NOT downloaded till after script.js is finished.
Are you measuring that with a packet sniffer, or some other tool?
If that's actually happening, it sounds like unwanted fallout from https://bugzilla.mozilla.org/show_bug.cgi?id=792438 that should be fixed, so I'd be very interested in it.
I'm not using a packet sniffer. I use a simple web server that is asynchronous. I judge what the browser (FF 21.0a2) does based on the requests I see coming in and being logged and what I see in the Web Console.
Here's the server-side log output:
[I 130403 08:32:39 web:1462] 200 GET / (::1) 4.41ms
[I 130403 08:32:39 web:1462] 200 GET /static/style.css (::1) 7.53ms
[I 130403 08:32:49 web:1462] 200 GET /static/script.js (::1) 10001.42ms
[I 130403 08:32:49 web:1462] 200 GET /static/photo.jpg (::1) 1.12ms
(notice that /photo.jpg wasn't loaded till 08:32:49, which was *after* /script.js was loaded)
The output on the Web Console is also interesting: http://cl.ly/O1Zl
Again, notice the 10 second delay till the /photo.jpg is loaded.
Anything else I can do to help with debugging this?
Hrm, it's well known that JavaScript blocks the browser from downloading anything else:
https://developers.google.com/speed/articles/include-scripts-properly
Not CDN'ing the JS might be a fair call. But for CSS assets it increases the HTTP pipeline.
I do not deny that CDN'ing the CSS assets can increase the HTTP pipeline. The point is that sometimes that additional DNS lookup can be hazardous.
If it's not just outright slow (like I showed in 2009: http://www.peterbe.com/plog/slow-dns), it can be yet another thing to go wrong, like I showed above where rackcdn.com and contextly.com were both down, rendering the site useless.
Not really a big concern for lots of people, but in China, I often wait 2 minutes or so for big websites to load - because it's waiting for the Twitter/Facebook social sharing scripts (which are blocked) to time out before loading the rest of the page. It's annoying.
Wow! That's really not supposed to happen. Twitter and Facebook usually use async script tags.
Unless something's changed in their new code, generally if I visit a blog with the 'share' bar near the top of the HTML, it blocks the rest of the page for a few minutes... I wish there was a Firefox applet to 'Skip' :)
> Sure, that fancy CDN might be faster at serving up stuff than your server but looking up a CDN's domain is costing one more DNS lookup
In theory, perhaps. Real-world tests show otherwise often enough.
https://github.com/h5bp/html5-boilerplate/pull/1327#issuecomment-14927169
TL;DR: Browsers are capped at ~8 connections/hostname, but can handle ~12-16 connections at once. And those CDNs may be faster than your shared hosting, which will offset the DNS lookup (if that is needed).
I agree that most scripts should be in the footer; jQuery animations I find in many WordPress templates are perfect examples. However, sometimes JavaScript is used to pull & display the main content, so top-loading is the only way to reduce content load-times. As I explained above, it MAY be faster to early-load a few scripts.
YMMV; you have great warnings Peter, but the best suggestion is "test for yourself" for optimal performance!
GitHub (GitHub Help, at least) is guilty too.