2 Oct 2000 Adrian   » (Master)

As far as the question of "who needs more than 4gig of ram", the answer is apparently "lots of people".

It was a pretty common request while I was working in tech support. As often as not, it was people looking for >4gig support per process, and at least with 2.2, x86, and Linux, that wasn't really an option. But it wasn't uncommon for someone to want or need to be able to malloc half a dozen gigs of RAM. And not just malloc it, but actually use it as well.

But very often, people just needed more memory available because of huge numbers of processes running. Web servers running on big hardware with long-lived cgi/asp/*let processes were a common theme. I.e., a few thousand perl processes taking a couple of megs each on a single machine. Yes, people really do that.

Or perhaps you just need to serve up several thousand hits a second with a single machine. TUX and 32 gigs of RAM to the rescue... Okay, so I don't know anyone actually doing that in production. But one thing I learned in tech support is that if there is a limit on something in Linux, someone will run into it.
