Pagetest (a web page performance test) is a great tool for doing what Firebug does, but outside your browser. Pagetest can run repeated tests to iron out any outliers. An alternative is Pingdom Tools, which has some nifty sorting functions but is generally the same kind of thing.
So I ran the homepage of my website through it, and the conclusion was: wow! Half the time is spent on DNS lookup!
The server it sits on is located here in London, UK, and the Pagetest run was made from a server also here in the UK. Needless to say, I was disappointed. Is there anything I can do about that? I've spent so much time configuring Squid, Varnish and Nginx and yet the biggest chunk is DNS lookup.
In a pseudo-optimistic fashion I'm hoping it's because I've made the site so fast that this is what's left when you've done all you can do. I'm hoping to learn some more about this "dilemma" without having to read any lengthy manuals. Pointers welcomed.
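If you want to poke at the same thing yourself, here's a rough sketch (plain Python, nothing fancy; the hostname is just my own homepage as the example) that times the DNS lookup on its own, separate from the actual HTTP request:

    # Rough sketch: time just the DNS lookup, separate from the HTTP fetch.
    # Note: the OS and any local resolver cache may answer repeat lookups
    # from cache, so only the first attempt approximates a "cold" lookup.
    import socket
    import time

    def time_dns_lookup(hostname, attempts=5):
        for i in range(attempts):
            start = time.time()
            address = socket.gethostbyname(hostname)
            elapsed = (time.time() - start) * 1000
            print("attempt %d: %s -> %s in %.1f ms" % (i + 1, hostname, address, elapsed))

    time_dns_lookup("www.peterbe.com")  # my homepage, as the example

The first number is the interesting one; the repeats mostly tell you whether a local cache is kicking in.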
Comments
Well, it's pretty fast already. It loads in 0.5-2 sec on my machine. I think any further effort spent on optimisation will have very little impact on the user experience.
I wish mine was half as fast...
It would be very interesting to know how you got this far in optimizing it.
Could it be that you have your NS service from MyDomain, who have many, many customers on their servers?
This delay should only happen when your nearest DNS server doesn't have the domain cached.
You have 900 seconds TTL on www.peterbe.com, which means that local DNS servers need to ask MyDomain again quite often...
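As a quick sanity check (just a sketch, and it assumes the third-party dnspython package is installed), you can ask for the record yourself and look at the TTL it comes back with:

    # Sketch: fetch the A record and print the TTL the answer carries.
    # Requires the third-party dnspython package.
    # Note: querying through a caching resolver may show a TTL that is
    # already counting down from the original value.
    import dns.resolver

    answer = dns.resolver.query("www.peterbe.com", "A")
    print("TTL: %d seconds" % answer.rrset.ttl)
    for record in answer:
        print("A record: %s" % record.address)

If that comes back as 900, bumping the TTL up would let resolvers hold on to the answer a lot longer.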
BTW I tried it on one of my static sites and got a lot of bogus results. But the DNS resolving time for my .se address, hosted on my own TinyDNS, was a third of yours with the same UK test server as you used (I think). Maybe .SE is faster than .COM too.
BTW2: We met at EuroDjangoCon, where we had a short lunch together one day at the conference.
A useful URL, thanks Peter. I was able to make some highly effective tweaks. I'm in the UK too and so is my server. My DNS is served by Zoneedit.com and it's resolving for me in 0.15s.
Hi Peter,
Do you have nscd (Name Service Cache Daemon) installed on the machine? I guess it would help.
It's the clients' DNS servers that could make use of a DNS cache, not Peter's...
FWIW, caching is disabled on the test systems on purpose but it will leverage any caching done by the ISP or further upstream (which is where the TTL becomes important).
One thing to be careful with is that the UK location for WebPagetest is running with minimal to no latency (directly connected to high-bandwidth data-center connectivity). Odds are that for real users on DSL or other connectivity (where 50 ms last-mile latency is not unusual) the bottlenecks may move, and using an image sprite for a bunch of those images will help.
We're working on making the connectivity in the UK location more like what we have in the US and NZ locations where the connectivity is more in line with a consumer experience.
Not sure if this kind of thing is what you're looking for, but Jeff Atwood posted somewhat recently about their change of DNS Provider for StackOverflow (here: http://blog.stackoverflow.com/2009/09/new-dns-provider/) because of DNS instability and response times. Might be a good jump-off point for great reading about speeding up your DNS.
Dude, I'm in South America and your site feels blazing fast (actually much faster than many local sites). So, no worries.
Thanks for the pointer to webpagetest.com, that's a good resource!
Webpagetest is a great and very comprehensive tool with great options. Pingdom is an alternative, but there are also two more tools that check website performance, and both use Google Page Speed and YSlow:
http://gtmetrix.com
http://tutor.rs
The first two images now give a 404. The third one works just fine for me though.
Sorry about that. About a year ago I rewrote my whole blog from scratch and migrated all the data from an object database into PostgreSQL, and some content was neither part of the database nor cached, so I couldn't pick it up.