Computer Measurement

Posted 1 Aug 2002 at 15:11 UTC by sye

I was reading literature on the Computer Measurement Group site. One article, by Jing Zi of Keynote Systems, observes: "Statistical analysis of Web page download time measurements suggests that some relatively simple formulae can be derived to project page download times based on Web page composition and TCP connect time for a browser/server pair"
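To make the quoted claim concrete, here is a rough sketch of the kind of "simple formula" such an analysis might produce. This is not Keynote's actual model; the linear form, the function name, and all coefficients (objects per connection, effective bandwidth) are illustrative assumptions.

```python
def project_download_time(connect_time_s, object_sizes_bytes,
                          objects_per_connection=4,
                          effective_bandwidth_bps=400_000):
    """Estimate total page download time in seconds.

    connect_time_s: measured TCP connect time for the browser/server pair.
    object_sizes_bytes: sizes of the HTML page and each embedded object.
    objects_per_connection: objects fetched per TCP connection (assumed).
    effective_bandwidth_bps: assumed effective throughput in bits/second.
    """
    n_objects = len(object_sizes_bytes)
    # Each new TCP connection costs roughly one connect time (handshake),
    # and each object costs about one more round trip for the HTTP request.
    n_connections = -(-n_objects // objects_per_connection)  # ceiling division
    setup_cost = n_connections * connect_time_s
    request_cost = n_objects * connect_time_s
    transfer_cost = sum(object_sizes_bytes) * 8 / effective_bandwidth_bps
    return setup_cost + request_cost + transfer_cost

# Example: 50 ms connect time, a 30 KB page with nine 10 KB images.
estimate = project_download_time(0.05, [30_000] + [10_000] * 9)
```

The point of such a formula is that everything on the right-hand side is either static (page composition) or cheap to measure (one TCP connect), yet the left-hand side tracks what the end user actually experiences.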

As an active member on advogato & badvogato, I find it interesting to ask whether a formula could be written to derive Web page composition (including weirdo 'illegal code') from the page download time, or a formula to derive the TCP connect time for a browser/server pair from all the other known measurements. Any further thoughts on the subject?
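On the second question: rather than inverting the projection formula, TCP connect time can also be measured directly, which gives a baseline to compare any derived value against. A minimal sketch (host and port here are placeholders, not anything from the article):

```python
import socket
import time

def tcp_connect_time(host, port=80, timeout=5.0):
    """Return the TCP three-way-handshake time, in seconds, to host:port."""
    start = time.perf_counter()
    # create_connection completes once the handshake succeeds; the context
    # manager closes the socket immediately afterwards.
    with socket.create_connection((host, port), timeout=timeout):
        elapsed = time.perf_counter() - start
    return elapsed
```

Inverting the formula the other way, from download time back to page composition, is underdetermined: many different mixes of object counts and sizes can yield the same total, so at best one could recover aggregate quantities (total bytes, approximate object count) rather than the exact composition.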
