Older blog entries for pphaneuf (starting at number 275)

10 Dec 2006 (updated 10 Dec 2006 at 20:08 UTC) »

Significant impact for a supposedly virtual world

Nicholas Carr wrote about how much electricity a Second Life citizen uses in a year, trying to estimate the electricity consumption per virtual citizen. It comes out close to the Brazilian average (note that this is just electricity, not counting cars, heating oil, etc.). Translated into CO2 production, it comes to 1.17 tons of CO2 per virtual citizen (14,763 tons overall), equivalent to driving an SUV for 3,700 kilometres.

It has been pointed out that the consumption of the users' computers is not taken into account, and that the amount of power needed for cooling has been underestimated as well.

Someone commented that at rates commonly found in Texas, it would come out to about $1,500 per server annually, so it's not only an environmental issue, it also comes down to real money. This I find very encouraging, because it has the air of a concrete implementation of a true-cost economy. Even though it's really far from being true-cost, it gives a plain old capitalist incentive to go easy on the environment, one that's becoming harder and harder to brush aside.

When you buy appliances these days, you generally see a label rating their energy consumption relative to other devices in the same class, and I don't know about people in general, but I try to pick the A or A+ models, not only because it's the right thing to do, but for the rather self-centered reason that it'll be easier on my bank account.

I hope to see something similar for computers, and hopefully see people and businesses favour environmentally (and thus, financially) friendlier machines.

Epilogue: Think of the World of Warcraft servers now.

Syndicated 2006-12-10 16:20:24 (Updated 2006-12-10 19:41:22) from Pierre Phaneuf

Stabby...

I'm so tired. I feel like a whiny bastard, and I can see myself coming up with how things are better back home or shit like that.

Some people have asked me: why don't I just go back, then? Well, inertia works both ways around here, you see? I'd be packing up already, but just as I had so much difficulty finding an apartment, I have this insane contract where I'd have to lick the pavement from here to Bordeaux in order to get out of it.

But I have to admit, some mornings, I'm bloody tempted to just tell all of this to fuck off, take my plane ticket and head back.

In the meantime, I'll be over there, writing the plan for the plan that'll allow us to plan. But I wouldn't expect much result or effect...

Syndicated 2006-12-08 12:27:25 (Updated 2006-12-08 12:58:29) from Pierre Phaneuf

I need some REST (sorry, it's really a terrible pun)

apenwarr mentioned components recently (among other things). Just wanted to mention that, as the designer of a COM-like component system, yeah, Unix pipelines are indeed a component system, and the only truly successful one at that!

COM also did okay as part of ActiveX and its OCX predecessor, Visual Basic components being the only ones that succeeded as a market, which isn't bad, as that was supposed to be the whole point. But it never came down to the people the way Unix pipelines did (for some value of "people").

While I agree with most of the rest of his post, I think he got REST slightly wrong. REST isn't inherently easier to index than SOAP, and would need similar plug-ins to do so. A REST protocol could be designed to provide the same consistent reliability, without mangling data. But REST is just a "style" of web services, comparable more to things like "message passing" or "remote procedure calls". That sort of makes his point correct anyway, since there's no "REST protocol" to "beat" SOAP and the WS-* ones.

Syndicated 2006-12-06 20:49:10 (Updated 2006-12-06 20:53:25) from Pierre Phaneuf

Getting the immovable ball rolling

Big surprise, I'm depressed some more. It's difficult to pinpoint any one source, but there's this general feeling of insurmountable inertia that would put the biggest and heaviest icebreaker to shame. It's in the heavy bureaucracy, the attitude of people, the ever-present ads for insurance companies, the traditionalist attitude, the enforced politeness, the overbearing smell of fear of change in the air. And here I am, an agent of chaos, of movement, valuing high manoeuvrability and careful instability, in this environment that more or less wants me to stop existing as I am.

I'm the most "normal" I've ever been, even as a child, and yet, I feel like I'm a complete weirdo. I used to take trips hundreds of kilometres away on a whim, and now the most spontaneity I get to express is getting up to get some pudding from the fridge.

Syndicated 2006-12-05 18:37:02 from Pierre Phaneuf

The Rose of Versailles

I dreamt I was making out with Lady Oscar. Weird, but not really surprising, I guess...

Syndicated 2006-12-05 07:35:17 from Pierre Phaneuf

I had no idea it was that bad!

I was checking out a review of new Xeon hardware, which was using MySQL 4, MySQL 5 and PostgreSQL 8.2 beta. The test is mostly realistic, consisting of PHP pages doing queries, taken from their own website, with one exception: the dataset is restricted to something like 2 gigabytes, so that it doesn't hit the disks too much (which either makes it unrealistic, or is a strategy for staying sane).

The article mostly talks about the hardware, but the results put MySQL in, uh, a "bad light". MySQL 4.1.20, going from one single-core processor to two dual-core processors (that's between two and four times the horsepower), goes 56% faster. MySQL 5.0.20a goes 40% faster. PostgreSQL 8.2-dev (okay, dev, whatever, have you looked at MySQL recently?) puts in a 224% increase in performance, meaning that it actually gets the "slightly more than three times as fast" you'd expect from having "between two and four times the horsepower".
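To put those percentages side by side, here's my own back-of-the-envelope arithmetic (the numbers come from the review; the scaling-efficiency framing is mine): a "224% faster" figure means 3.24 times the throughput, measured against the roughly 4x ceiling of the added hardware.

```python
def multiplier(percent_faster):
    """Turn an 'X% faster' figure into a throughput multiplier."""
    return 1 + percent_faster / 100

# Going from one single-core CPU to two dual-core CPUs is at best 4x
# the raw horsepower; compare each database's actual gain against that.
for name, pct in [("MySQL 4.1.20", 56), ("MySQL 5.0.20a", 40),
                  ("PostgreSQL 8.2-dev", 224)]:
    m = multiplier(pct)
    print(f"{name}: {m:.2f}x, {m / 4:.0%} of the ideal 4x")
```

So PostgreSQL lands at 81% of the theoretical maximum, while the two MySQL versions sit below 40%.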

They also show graphs of how the databases behave under concurrent requests, where you see MySQL peak at about 520 requests per second, then decline as the number of concurrent requests grows. PostgreSQL, on the other hand, slowly climbs to about 640 requests per second, then pretty much stays there, clearly limited to that level by the hardware, a textbook example of nice scaling. At a concurrency of 100, PostgreSQL pulls in nearly twice as many requests per second as MySQL.

So much for MySQL being "the fast one" (PostgreSQL was traditionally the "safe and correct one", which was already nothing to sneeze at, as long as it managed to be "fast enough").

Syndicated 2006-12-04 10:54:17 (Updated 2006-12-04 10:54:43) from Pierre Phaneuf

A sign of the times

As apenwarr is demonstrating, it seems linuxcentrism is the new vaxocentrism.

I find it especially ironic that he sees BSD as an "imitation Linux" when, historically speaking, it's the other way around. I won't argue about Linux being where it's at nowadays, but sometimes the old fart still knows a trick or two.

Syndicated 2006-12-02 14:47:48 (Updated 2006-12-02 14:57:33) from Pierre Phaneuf

2 Dec 2006 (updated 2 Dec 2006 at 17:05 UTC) »

On being a (crappy) Mac zealot

[info] caffeinemonkey said, a long time ago, a few things about Mac OS X...

I don't really have much of a preference between Xft and Mac OS X anti-aliasing, except that the latter works right away (but this advantage is fading away very quickly).

His comments about playing video I find a bit strange, because that's one of the first things I did nicely with my Powerbook: playing DivX on the television while doing other stuff on the main screen. Of course, I didn't use QuickTime Player, which wouldn't even play fullscreen and which I instantly recognized as useless payware, trying to get you to buy it at every other click. I was told VLC was the "plays everything just fine" player for Mac OS X, put it on, and never looked back. And it lets you choose quite easily which screen you want it playing on in fullscreen mode. Ok, so there's this funky Front Row thing, but I'm not at that point just yet.

And like apenwarr, I listen to my music rather than watch it. But unlike him, I actually use my laptop to play video, because it's one of the only machines I've ever had where it actually worked. I scoff at people wrangling with their Windows video configuration to try and get their TV-out working, where I simply plug it in and it works. Ha!

Also, while I'm a bit annoyed at the slightly outdated shell environment (hey, at least they're using bash nowadays!), and I won't hang around in it too much, it's a reminder that the world isn't all Linux. Believe it or not, there are still some poor schmucks stuck with Solaris, HP-UX or some even creepier things, like Windows! *shudder*

I don't use the mouse that much, being more keyboard oriented (I make a poor Mac zealot, at times). When browsing, I keep my left hand on the keyboard anyway, Quake-style, generally to close the current tab quickly, but as a side-effect, I have my thumb on the Command key, so opening in a new tab is quite easy. I actually suspend and resume a lot, as I often use my computer the whole day, a little at a time, when I'm away from power. I love that I can open it and that it comes up quickly enough not to lose my train of thought, so I can look up something, then close it right back.

I use Aquamacs, which can be quite questionable at times, but I manage to do a fair amount of damage with it. I've always tended to use few windows, so the two-layered window management isn't that bad (I actually like that the arrow keys work once you go into "alt-tab mode"; they don't in most X11 window managers, and I miss it).

Mind you, I have a number of gripes about this stuff, but you know me, I've got gripes about everything. But they're not killers, being minor annoyances at worst (compare this to [info] azrhey's "Linux laptop", where Ubuntu Edgy won't use its wireless card, despite it supposedly being supported). It's not a fantastic development machine, I'd still go with a good Linux desktop workstation for that (the Xcode tools are very sweet, on one hand, but man, where's valgrind?), but it's a nice environment for day-to-day stuffing around. It works, well enough.

Let's say I generally agree with Tim Bray on that, feeling that Ubuntu and friends are breathing very close to the back of their neck!

Update: Of course, speaking evil about Ubuntu caused [info] sfllaw to come out and point out that I'm actually a complete moron when it comes to computers. It's amazing that I'm able to figure out how my pants work. When I come to think of it...

Syndicated 2006-12-02 13:17:14 (Updated 2006-12-02 16:49:02) from Pierre Phaneuf

Choo Choo!

Booked some tickets on the TGV for Christmas, to visit some of [info]azrhey's family. I'm looking forward to finally trying this out!

I'm pondering ideas for projects, and I think I've got one or two things that could be promising, so this is rather encouraging. I'm still having more questions than answers, of course, but everything in its time...

Syndicated 2006-11-30 12:39:08 from Pierre Phaneuf

Change is the only constant

Philip Van Hoof (also known as my evil twin, not so long ago) thought his Tinymail API was frozen, but ended up having to change it anyway.

It's been said many times, but there it is again: change is the only constant.

When I tell people about XPLC, they often think that it's about dynamically loading plug-ins, extending applications at runtime. But it's not really that. What it is about is interfaces.

It just so happens that interfaces are a prerequisite for plug-ins, but really, they are everywhere. An API is an interface, as is an ABI. It's just that the first obvious use for an interface is plug-ins, extensions, or whatever dynamically loaded piece of code.

Some projects are very aware of their interfaces, glibc or libpng for example. The former uses all sorts of ELF tricks (like sonames and symbol versioning) so that code linked against an earlier version still works correctly with a newer version (this is called backward compatibility). The latter is careful to never remove or change the semantics of existing functions, only adding new functions while providing a way of knowing whether a function is available for your use. Others, like GTK+, have taken a different approach, making sure that incompatible versions can be installed in parallel (the downside being that a bug fix in the newer, still-supported version will not fix applications still using the older version).
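The libpng-style discipline — only ever adding entry points, and letting callers probe for them — can be sketched in a few lines. This is a hypothetical illustration with made-up names, not libpng's actual API:

```python
# Hypothetical library: version 2 only *adds* entry points, never
# removes or changes them, so code written against version 1 keeps
# working unchanged.
class ImageLibV2:
    def read(self, data):
        return ("classic", data)

    def read_progressive(self, data):  # new in version 2
        return ("progressive", data)

lib = ImageLibV2()

# A caller probes for the newer entry point instead of assuming it,
# falling back gracefully when running against an older installation.
read = getattr(lib, "read_progressive", lib.read)
kind, _ = read(b"some bytes")
```

The same probing pattern works at the C level with a version macro or a `dlsym` lookup; the point is that the caller asks rather than assumes.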

What does XPLC bring to the game? It helps manage all of these things, in a portable way. Those ELF tricks I mentioned do not work on platforms that do not use ELF, like Mac OS X or Windows. Also, the technique used for defining interfaces allows not only backward compatibility, but also forward compatibility, where an application that only uses a subset of an interface can deliberately ask for an older version of the interface (one still supported by the library implementing it, of course), so that its runtime requirements are lower. This is not possible at all with ELF symbol versioning, for example.
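A COM-style sketch of how querying for interfaces buys both directions of compatibility — hypothetical and much simplified, not XPLC's actual API: an application that only needs the version 1 interface asks for it by identifier, and a newer object keeps answering to it.

```python
class IExampleV1:  # interface id "example:1"
    def frob(self): ...

class IExampleV2(IExampleV1):  # interface id "example:2", adds twiddle()
    def twiddle(self): ...

class ExampleImpl(IExampleV2):
    """Newer implementation, still honouring the older interface."""
    SUPPORTED = {"example:1", "example:2"}

    def get_interface(self, iid):
        # The discovery call: returns the object if it honours the
        # requested interface, None otherwise.
        return self if iid in self.SUPPORTED else None

    def frob(self):
        return "frobbed"

    def twiddle(self):
        return "twiddled"

obj = ExampleImpl()
v1 = obj.get_interface("example:1")  # all an old application asks for
```

The old application's runtime requirement is only "something answering to example:1", which both old and new libraries satisfy.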

A library providing its interfaces through XPLC can even make incompatible changes to an interface in a new version (such as changing the semantics or replacing a function with another), while still maintaining backward compatibility, admittedly at the cost of providing adapter glue. Some people seem to think that you have to provide this glue, but it is left to the discretion of the library's implementers (consider that it's also possible to stack adapters, making things work, though likely at some cost in runtime efficiency). If we think back to the GTK+ approach of parallel installation, this means that fixes to the library will fix the users of the old interface just as much as those of the new one. There is a downside, of course, in that this adapter glue is more code, and more code brings "opportunity" for more bugs, but still, the glue itself can be fixed in a newer version, without breaking the interfaces.
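The adapter glue can be sketched the same way (again with hypothetical names): the new version changed the entry point incompatibly, and a thin adapter keeps the old contract alive on top of the new code, so fixes in the new code reach old callers too.

```python
class NewEngine:
    """Version 2 replaced frob() with frob_many(), an incompatible change."""
    def frob_many(self, items):
        return [f"frobbed {item}" for item in items]

class OldInterfaceAdapter:
    """Adapter glue: serves the old one-at-a-time contract on top of
    the new code."""
    def __init__(self, engine):
        self._engine = engine

    def frob(self, item):
        # All real work happens in NewEngine, so any bug fix there
        # benefits old callers going through this adapter as well.
        return self._engine.frob_many([item])[0]

legacy_view = OldInterfaceAdapter(NewEngine())
```

Stacking adapters (a v1-on-v2 adapter on top of a v2-on-v3 one) works the same way, at the cost of an extra hop per call.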

When one considers the possibility of security bugs in older, unmaintained versions of a library, the only two options would be to update all the applications to use the new version of the interface (which is impractical, for clear reasons; anyone needing a reminder need only look at XMMS, still widely used despite many attempts at killing it) or to keep updating the older version. With XPLC, the adapter glue can be fixed, and since the applications are in fact using the new version of the library, they can migrate to the newer interface incrementally, without major disruption or the heavy burden of a wholesale conversion.

There are also sometimes optional modules in a library, controlled by compile-time options (to the "configure" script, for example). XPLC interfaces are discoverable, meaning that you get the best of both strong typing and dynamic typing: you can explore the interfaces an object implements dynamically, and when it tells you that an interface is indeed supported, that is a strong, binding contract.
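Runtime discovery of an optional module might look like this (a hypothetical sketch, not XPLC's actual API): instead of a compile-time ifdef, the caller asks at runtime, and a None answer is a definitive "not here".

```python
class Compressor:
    """Stand-in for an optional compression module."""
    def compress(self, data):
        return data[:1]  # placeholder for real work

class Library:
    def __init__(self, with_compression=False):
        # The optional module may or may not have been built in.
        self._compressor = Compressor() if with_compression else None

    def get_interface(self, iid):
        if iid == "compression:1":
            return self._compressor  # None reliably means "not supported"
        return None

full = Library(with_compression=True)
slim = Library()
```

The caller queries `get_interface("compression:1")` and either gets an object bound to that contract, or knows for certain the feature is absent — no extra shared object needed just to find out.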

So, in summary, with XPLC interfaces, one can get:

  • A stable ABI.
  • Backward compatibility, even through "incompatible" changes.
  • Forward compatibility, allowing application writers to easily target older versions.
  • The opportunity for incremental migration from older to newer APIs.
  • Runtime discovery of optional modules, without requiring extra shared objects.

All of this is just the simple case of a single implementation of a given interface, but as Philip is demonstrating, it only starts there. He uses interfaces to let users of Tinymail provide their own alternative bits and pieces, mixing and matching as the task at hand requires. Not to mention unit testing, language neutrality (including scripting), and so on...

Syndicated 2006-11-29 21:22:14 (Updated 2006-11-29 21:28:05) from Pierre Phaneuf

