Peter Zaitsev wrote an interesting piece on the front-end performance of websites.
I’ve always tried to look at the front-end from the user’s perspective, rather than a purely technical one. Once you weed out what’s not really necessary for the user, and also deal with questions like “how important is it that this number is live?”, you generally end up with a fairly different site already ;-)
Before my time at MySQL, I wrote a little gizmo called Measly Mouse which leads a modest but still active life. When reading on from here, please remember it was designed in 2001 and hasn’t really been changed since.
Measly Mouse retrieves a page, deals with redirects, CSS and other includes like images, and tries to apply a basic metric to see how sensible the page is. Basic usability testing shows that people cannot choose between more than about 7 items. So you can imagine how the brain desperately fails to deal with most websites (website creators often feel everything is so important it must be on the front page), or through training filters out most things that don’t relate directly to what the person is looking for. The other key factor is size: the bigger the page, and the more includes it has, the longer it takes to load. That’s annoying. So the Measly scoring formula is as follows (a small code sketch follows the list): BYTES + ((REQUESTS + LINKS) * 1024), where:
- BYTES – the total size of the page content in bytes, including stylesheets, images, etc.;
- REQUESTS – the total number of requests required (including redirects) to get all page content;
- LINKS – the total number of clickable items on the page, including form fields.
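As a concrete illustration, here is a minimal sketch of that formula in PHP (the language the tool itself is written in); the function and variable names are mine, not from the original source:

```php
<?php
// Minimal sketch of the Measly scoring formula described above.
// The names are illustrative, not taken from the original tool.

function measly_score(int $bytes, int $requests, int $links): int
{
    // BYTES: total size of all page content (HTML, CSS, images, ...).
    // REQUESTS: total requests, including redirects, to fetch it all.
    // LINKS: clickable items on the page, including form fields.
    return $bytes + (($requests + $links) * 1024);
}

// Example: a 60 KiB page needing 12 requests, with 45 clickable items.
echo measly_score(61440, 12, 45), "\n"; // 61440 + (57 * 1024) = 119808
```

Lower is better: every extra request or clickable item weighs in at the equivalent of a kilobyte of page content.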
Naturally all pages have some includes and some links; it’s just a matter of finding the balance that keeps the site usable. Some sites use so many redirects… nutty.
At the time there was quite a lot of debate on the simple methodology and the owners of some sites got pretty upset when someone ran their front page through Measly and it ended up in the top 10 ;-)
I still reckon the concept has merit though… please do make your own judgement and feel free to comment.
You may find some aggregating sites (like mailing list archives) in the top 10… I personally take those entries less seriously, because they’re generally focused on a niche (geek) group which does not conform to a general user profile. Still, it’s quite possible their user interface could be improved!
As to how the tool works… it actually parses the pages in PHP using regexes (again, remember the time it was built). Although it still works reasonably well, it could be vastly improved now to catch more of a modern page.
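To give an idea of what that looks like, here is a rough, deliberately naive sketch of such regex-based extraction, assuming the HTML has already been fetched; the patterns are illustrative and do not reproduce the original 2001 parser:

```php
<?php
// Naive regex-based extraction of the raw numbers the score needs.
// Assumes the page fetch itself (redirect handling, etc.) is done elsewhere.

$html = file_get_contents('http://example.com/');

// Clickable anchors (LINKS); form fields would be counted similarly.
preg_match_all('/<a\s[^>]*href\s*=\s*["\']([^"\']+)["\']/i', $html, $anchors);

// Embedded resources that each cost an extra request (REQUESTS).
preg_match_all('/<img\s[^>]*src\s*=\s*["\']([^"\']+)["\']/i', $html, $images);
preg_match_all('/<link\s[^>]*href\s*=\s*["\']([^"\']+)["\']/i', $html, $styles);
preg_match_all('/<script\s[^>]*src\s*=\s*["\']([^"\']+)["\']/i', $html, $scripts);

$links    = count($anchors[1]);
$requests = 1 + count($images[1]) + count($styles[1]) + count($scripts[1]);
$bytes    = strlen($html); // each included file’s size would be added once fetched

printf("links=%d requests=%d bytes=%d\n", $links, $requests, $bytes);
```

A real run would also fetch every referenced file, both to add its size to the byte count and to catch things like images referenced from within CSS, which is exactly where a regex approach starts to creak on modern pages.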
But what would be really great is if someone would care to turn Measly Mouse into a Firefox plugin. Inside Firefox you have clean access to a page, so the analysis becomes extremely easy. For any page, the plugin could calculate the Measly Mouse (MM?) score, and perhaps optionally submit it to a central location. Who would like to pick this up?
Hmmmm… are you kidding? Doesn’t YSlow do what Measly Mouse does, only a bit better?
I mean, Measly Mouse doesn’t include CSS and JavaScript links as content to retrieve. And within CSS there can be images which are not counted either.
I think YSlow does a fair job of reflecting what the browser has to do to retrieve the complete page.
OK, YSlow doesn’t give you an ‘index’ as a result, and maybe doesn’t count the outgoing links on a page. But maybe you can inform the YSlow people and hint them towards this idea.
– Unomi –
You are referring to http://developer.yahoo.com/yslow/ (also http://www.codinghorror.com/blog/archives/000932.html), which was developed in early 2007. You may recall from my post that Measly dates from 2001.
Measly does include CSS and JavaScript retrieval, counting both the links and the file sizes. Of course, being a handcrafted parser, it may miss some items; I also noted this in my original post.
YSlow does give a kind of score, but not quite along the same lines. Indeed, it may well be possible to merge the Measly idea into YSlow, which as you rightly say already has a plugin available.
I don’t think your algorithm is accurate. My page is simple, standards-compliant and doesn’t even use a back-end DB. Even the image galleries on my site load faster than this blog post I am replying to did. Any idea how a static site of mostly local URLs ended up ranked #1? From the people I conferred with (and I am a developer myself), it seems a little strange and inaccurate.
The algorithm is in my original post as well as on the Measly page. The stats Measly picked up from your site are also visible in the listing, and while Measly analyses a site it shows exactly what it’s doing. So you can work it all out, right?
Do also look at Yahoo’s YSlow plugin for Firefox, and see what it comes up with!