Older blog entries for raph (starting at number 406)

auto* delenda est

David Turner (freetype)'s recent post in response to titus reminded me of my own auto* aversion. In sum, I think auto* represents everything that is bad about a free software project.

Don't get me wrong, auto* was (and still is) a tremendous improvement over the bad old days of hand-editing makefiles just to have a chance of having your software build. But it is well past time to have designed, implemented, and deployed a better alternative, and I don't see too many good signs of that.

What's wrong with it? Let me enumerate the ways:

1. It's way too complicated. Good software and free software, not to mention good free software, run on simplicity. auto* does not have this quality.

2. It's implemented in bad languages. One bad language would be bad enough, but M4, (portable) make, and portable shell? There's a good reason nobody else has even attempted writing an app in that combination of languages.

3. Original goals are no longer very relevant. In the bad old days, there were lots of vendor Unices and other strange build environments. Today, in the *nix world, there is just the GNU toolchain. The amount of actual diversity that needs to be configured around is minimal.

4. It doesn't solve real-world portability problems. For many users, getting programs to build on Windows is at least as important as compiling on an ancient MIPS running Ultrix, yet auto* isn't much help with the former.

5. Bad error reporting. In "configuration science", one of the overriding goals should be the production of clear and meaningful error messages.

6. Lack of overall systems thinking. Much of what auto* does is work around limitations in tools such as sh, make, ld, package managers, and the like. If some of these other components are better places to solve configuration and build problems, let's do it there rather than twisting ourselves into pretzels trying to work around them. Apple had the guts to extend ld in several important ways, including two-level namespace support. Why are we still stuck with the clunky late-'80s approach copied from old vendor Unices?

It's been clear for a long time that CVS needed replacing, and now we have a variety of great alternatives, some of which exhibit that classic, simple, do-one-thing-and-do-it-well free software philosophy. We should have something similar for the problems that auto* solves.

Japan

I find myself posting from Japan once again. Why is it that I'm more likely to find a free moment here than back at home in Berkeley? Anyway, it's nice and cold, and I even got to see some snow up north in Matsumoto.

The happiest baby on earth

I took a picture of Alan when he was a baby, and for a while it was one of the first-page hits on Google Image search for the keyword "happy". Over time, several people have asked to use the picture.

Most recently, it graces the front page of UC Riverside, where it is used to illustrate a research study on the nature of happiness. That makes it official: when he was a baby, he was the happiest baby on earth.

The funny thing is, the day I took that picture he was mostly unhappy. We were packing for a move, and he was quite cranky that we were paying attention to all these boxes and things instead of him. I took a break for a few minutes, and he was soooo happy that I decided to take a picture. He was still happy to be the focus of attention, and I think the pic shows that.

Nokia 770

rillian brought his Nokia 770 when visiting here, and it seems really cool. The kids liked it, as well - Alan surfed to the neopets site and was able to log in, and Max made a drawing with the sketch (primitive paint) app.

My take is that the form factor is a winner, but I think I'll wait until the second generation to actually get one. The CPU is quite pokey by modern standards, and memory is tight.

It does run Ghostscript right out of the box, though! It seriously looks like it's a lot easier to develop for than your usual handheld.

More trust metric

I got a gratifying response to my trust metric rant in the last post - a couple of emails, some blog comments. It's clear now that I need to do a more detailed writeup of exactly how to implement the eigenvector-based trust metric in the context of a large Wiki.

Pete Zaitcev writes: One half is spam and abuse, and other half is that conventional, highly credible and trusted wisdom is simply wrong. I'm not sure exactly what he means by this, but it may have something to do with the fact that, from the perspective of approximately one half of the population of this country, approximately the other half is under the spell of a mass psychosis in which the usual rules of reality simply don't apply anymore.

It's not clear to me how a large wiki should handle this situation. One intriguing possibility is that the subgraphs of sane people and deluded people both form cliques, so that when a sane or deluded person is logged in and the trust metric is computed from their node, they see a version of the page that is factual and objective, or conforms to the parameters of their delusion, respectively.

The Clever search engine from IBM Research has an interesting take on this issue. While PageRank and the Advogato trust metrics compute only the principal eigenvector, the Clever researchers also compute some of the others, resulting in "clusters". They report, for example, that the second eigenvector of the link graph for webpages on abortion neatly separates pro-life from pro-choice. Indeed, this very eigenvector is likely to correlate strongly with the sane/deluded distinction described above. The sign of the correlation is, of course, left as an exercise for the reader.
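
To make that concrete, here's roughly the computation I have in mind, as a Python sketch. Everything here - the names, the damping constant, the graph representation - is illustrative, not code from Advogato or Clever:

    # A minimal sketch of an eigenvector-based trust metric, in the spirit
    # of PageRank. Trust flows along certification edges; the principal
    # eigenvector of the damped transition matrix gives each node's score.
    DAMPING = 0.85     # PageRank-style damping; an illustrative value
    ITERATIONS = 50    # power iteration converges geometrically

    def trust_scores(cert_graph, seed):
        """cert_graph maps every account to the (possibly empty) list of
        accounts it certifies; seed is the trusted root account."""
        nodes = list(cert_graph)
        score = {n: 0.0 for n in nodes}
        score[seed] = 1.0
        for _ in range(ITERATIONS):
            nxt = {n: 0.0 for n in nodes}
            for n in nodes:
                outs = cert_graph[n]
                if not outs:
                    nxt[seed] += score[n]   # dangling trust returns to the seed
                    continue
                share = score[n] / len(outs)
                for m in outs:
                    nxt[m] += share
            # mix the flow with a restart at the seed; the restart is what
            # keeps a clique disconnected from the seed at a score of zero
            score = {n: DAMPING * nxt[n] + (1 - DAMPING) * (n == seed)
                     for n in nodes}
        return score

A clique of spammers certifying each other never receives any flow from the seed's side of the graph, so its scores stay pinned at zero - that's the attack resistance in a nutshell.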

Teeth

My teeth are a bit sore. It turned out the cavity I was to have filled today had more decay than expected, so I get to have a crown instead of a filling. It could have been worse: the decay didn't reach the nerve, so I don't need a root canal.

Time for a website for free font development?

With all the recent activity in free font land, I decided to set my ideas down "on paper", and posted a thread over on typophile. I chose to post it there rather than here because I want the input from people in many different communities, especially type wonks. If you're interested in free fonts, whether as a user or as a developer, head on over and add your 2 {(euro )?cents|pence|yen|whatever}.

Time for a trust metric enabled wikipedia?

I see that Wikipedia is having some well-publicized troubles with vandalism and the like. This will be a somewhat bittersweet response.

The success of wikis has taken a lot of people by surprise, but I think I get it now. The essence of wiki nature is to lower the barrier to making improvements to content. The vast majority of access-controlled systems out there err strongly on the side of making it too hard. The idea of a wiki is to err on the side of making it too easy, and to lessen the pain (somewhat) of undoing the damage when that turns out to be a mistake. In cases where that doesn't work out, I think the solution is to make the decision process of whether to grant write access a bit more precise, so you can still err on the side of trusting too much, but you don't have to err quite as often or as badly.

In that regard, the trust metrics designed and implemented for Advogato are a near-perfect match for the needs of a Wikipedia-like project, but for the most part, nobody is paying much attention to my ideas. Yes, I am bitter about that. I've written them up in a howto and some draft papers, arguably not as polished a presentation as the ideas deserve, but still comprehensible to somebody motivated to understand them. I've implemented them and released the code under GPL. That implementation is too tied to the somewhat quirky mod_virgule design, but adapting and modifying is what free software is all about, no?
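
For the impatient, the core of the Advogato metric boils down to a single max-flow computation over the certification graph. Here's a condensed Python sketch using the networkx library; the capacity schedule and the names are illustrative, not the production values from mod_virgule:

    import networkx as nx
    from collections import deque

    CAPACITIES = [800, 200, 50, 12, 4, 2, 1]   # capacity by distance from seed

    def accepted_accounts(cert_graph, seed):
        """cert_graph maps each account to the accounts it certifies."""
        # breadth-first distance from the seed sets each node's capacity
        dist = {seed: 0}
        queue = deque([seed])
        while queue:
            n = queue.popleft()
            for m in cert_graph.get(n, ()):
                if m not in dist:
                    dist[m] = dist[n] + 1
                    queue.append(m)
        g = nx.DiGraph()
        for n, d in dist.items():
            cap = CAPACITIES[min(d, len(CAPACITIES) - 1)]
            # split each node so its capacity constrains flow through it,
            # and give it a unit edge to the supersink: one unit = accepted
            g.add_edge((n, "in"), (n, "out"), capacity=cap - 1)
            g.add_edge((n, "in"), "sink", capacity=1)
            for m in cert_graph.get(n, ()):
                # certification edges are unbounded (in networkx, a missing
                # capacity attribute means infinite capacity)
                g.add_edge((n, "out"), (m, "in"))
        value, flow = nx.maximum_flow(g, (seed, "in"), "sink")
        return {n for n in dist if flow[(n, "in")]["sink"] >= 1}

The node-splitting trick is what bounds the damage: no matter how many bogus accounts certify each other, the flow they can capture is capped by the capacities of the legitimate nodes that certified them.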

So I haven't exactly gift-wrapped the trust metrics and presented them to the world on a silver platter, but they're not sitting at the bottom of a locked file cabinet in the basement of the local planning commission either. With Google now worth a brazillion dollars, due in large part to the success of their eigenvector-based trust metric, and with the problems of spam and abuse showing few signs of just going away on their own, you'd think there'd be more interest in creative, high-tech solutions to the problem.

Let's say for the sake of argument that there's a 50% chance that I'm a raving moron when it comes to this stuff - that my belief that a trust metric would go a long way toward solving problems such as Wikipedia's is just plain wrong. Say there's also a 50% chance that there are practical problems I don't foresee, so that, while the basic ideas might be valid, they just won't work on a project like Wikipedia. You can dispute the exact numbers, but that still leaves something like a 25% chance (0.5 × 0.5) that it really would be worthwhile for someone to invest the time and energy into making it happen. How much is Wikipedia worth to people? How much is the idea of decentralized collaboration worth, especially if it means you don't have to rely on "content serfdom" to get the good stuff?

Of course, the Free Software Way(TM) would be for me to pick up a shovel, dig in, and implement a trust metric enabled wiki myself. Well, pardon me for ranting, but in this case I believe the FSW is just plain dysfunctional. A large part of the reason I'm reluctant to invest much more of my own time and energy is the tepid reaction to the work I've put in so far. How is it that a community can generate dozens of IRC clients, me-too distributions, window managers, and PHP bbs engines, and yet leave the development and implementation of the Advogato trust metrics almost completely ignored?

Wow, even I am amazed at the intensity of that rant. I did say this post would be "bittersweet", but so far it's been pretty much all bitter. The sweet part is basically that I have faith that, in time, the Advogato trust metrics will be understood and deployed as widely as their ability to resist abuse deserves. Free software development, in particular, operates on a pretty slow clock. My last post contains a striking example - the roughly five-year lag between my release of a prototype watercolor simulator and the inclusion of the ideas in an actively developed app.

And already, I see some tentative signs of that. The Wikipedia development boards have some discussion of "trust metrics," although I don't see much evidence they actually understand the power of Advogato's. Additionally, there is some academic work starting to build on my own, including Paolo Massa's evaluations of the various extant trust metrics, and Daniel Stewart's "Social Status in an Open-Source Community", published very recently in the American Sociological Review.

And who knows, maybe even this post, despite the bittersweet tone, will inspire someone to take another look at my trust metric ideas. Hopefully somebody who has the technical ability to implement something a little more sophisticated than the usual PHP hash, and whose idealism about free culture and individual-centered web content has not become quite as jaded as my own. If someone out there were to do a nice job implementing an attack-resistant wiki, that would do wonders for reinforcing my faith in the community.

4 Dec 2005 (updated 4 Dec 2005 at 01:53 UTC)
Yay, I'm a pro font designer now!

As hinted at in my last post, TUG has awarded me a grant to complete and release Inconsolata, the monospace design I'm working on. Go take another look - all of ASCII is complete now (but will certainly be refined over time), and there's an OpenType download for Mac and Win readers out there (this should answer fxn's difficulties with trying it).

I find that, of all potentially relaxing creative activities, I enjoy font design the most. I've tried learning some musical instruments, drawing and painting (including a couple classes at Berkeley), and a few other things, but usually I'm just not that good at it. Or, sometimes, it feels like I could do good stuff but it takes a lot of mental effort and concentration. I find that I can draw a glyph or two even when I'm feeling cranky or tired, or that my mind is just not working. I expect to be spending more time on fonts.

The nature of collaboration in free software

I'm often disappointed or frustrated by the lack of collaboration I often feel in the free software community. Of course, a good deal of the fault lies with myself - given any kind of tension or conflict, my natural reaction is to go into hiding. I simply don't have the characteristics you'd find in a natural-born leader.

But one of the best things about free software is that it often lets collaboration happen in roundabout ways. Take my work about five years ago on watercolor simulation. I wrote some code and posted it, and have been thinking on and off since then about how to optimize the algorithms so you can get real-time performance on standard hardware. But I've never actually done that, or packaged up the code I have into a real, usable painting app.

Now, it looks like somebody else is. I got an email today from Bart Coppens asking for license clarification so he can use my code in Krita, which looks like it's developing into a real contender in the space now occupied by Corel (formerly Fractal Design) Painter.

Of course, there is more than a bit of irony here, as some would argue that the development work put into Krita could have been better spent adding similar features to Gimp. But it just doesn't work that way - nobody is paying the Krita people to do that, and no doubt they're having more fun doing things their own way. There are always decisions that would be made differently - choice of programming language, for one - so letting code adapt, and even be rewritten, is usually the most realistic way to let it live.

I think the same can be said of much of my free software work. Libart isn't being developed, but projects like Cairo and Inkscape are that much richer for having had Libart as a model. That's not hugely gratifying (especially when there are Advogato posts gloating about what a great thing it is to switch from Libart), but all in all it's a contribution I can be happy about.

Fonts

A bunch of things are happening in font-land.

For one, SIL has released version 1.0 of their Open Font License, and promises to be releasing Gentium under its terms shortly. The fact that SIL is becoming aware of free software licensing is very encouraging, as it promises to make their efforts considerably more relevant.

Even so, I'm not convinced that the OFL will have that great an impact. Earlier drafts tried to ban selling collections of fonts with OFL fonts included, but apparently that ran afoul of DFSG-style freedom. Now, apparently, it allows selling of collections, but not of the individual font. Was anybody actually selling free fonts individually before? Even if not, the adoption of the OFL may send a signal that the font is to be treated with more respect. As Wes Felter says, it's much like wearing a designer t-shirt. It will be interesting to see how aggressively the "free font" ripoff artists prey on Gentium - if they do back away, it might be an appealing example to follow.

I'm tracking this because I've got a few fonts in the queue that I'd like to release under some kind of free license, but am still unclear exactly which license is best. I've been in touch with Karl Berry about having TUG sponsor completion of one or more of the fonts, and the choice of license is still an open issue.

Font fans might be interested in taking a look at my latest font-in-progress, Inconsolata, a monospace design. I'm hopeful that it will turn out to be one of the best available for code listings, etc., in print.

Japan

The trip to Japan was really fun. On the last evening, I had a very nice dinner with Masatake Yamato and Akira Tagoh, both now of Red Hat Japan. We talked of many things, including areas where recent AFPL releases of Ghostscript may break some of the work done by the gs-cjk team to make substitution of Japanese fonts work correctly.

We also talked about free software tools for Japanese learners, and input methods for Emacs in particular. I've been using Quail, mostly because it was easy to find since it's included in Emacs distros, but apparently SKK is better.

One question I have about Quail: is there a way to go in the reverse direction: if I have a kanji in the buffer, can I make it tell me the key sequence required to produce that?

15 Nov 2005 (updated 15 Nov 2005 at 06:00 UTC)
Hello from Tokyo

I'm posting this from the Manboo comic library and Internet Cafe in Tokyo. It's probably the most uniquely Japanese experience I've had here. After all, cities are cities, and most name brands are global. Given a choice, I'd probably rather go to Fry's than the famed Akihabara district. But this is a concept that would probably only work in Japan, and definitely not in the States.

Basically, the deal is that you pay around $3/hr, which gets you a private cubicle with your own computer, TV, and PS2. Not only that, but you get free run of an impressive library of manga comics, free drinks, clean bathrooms, and a handful of similar perks.

Now, keep in mind, by Tokyo standards, that's an incredible deal. This glass of iced tea cost the company something around $8 at the Tokyo Hilton, a ten minute walk or so away. Refills not included.

The main reason it wouldn't work in the States, I think, is that people just wouldn't respect the space. They'd be stealing the books and equipment (there's a decent pair of headphones hanging on the wall, with no bizarre incompatible connector or other "security" mechanism to keep it there), defacing things for the hell of it, pissing in the cubicles, shooting up (although, truth be told, there is a distinctly herbal aroma to the cigarette smoke in here).

Meetings

We had three meetings with three Japanese companies. Two went very well, one was near-disastrous. (I won't name the companies out of discretion)

Doing business with Japanese companies is very difficult for Westerners. There's all of this culture, and what would be a straightforward comparison demo of technical skill can be interpreted as an insult to the engineering capabilities of the host. People talk about "honne" and "tatemae" in terms of great mystery, as if it's impossible for Westerners to grasp, but it's not really that hard.

Take, for example, the really fancy Japanese dinner we were treated to at a restaurant in Matsumoto city on Friday, when they offered a shabu-shabu of the male reproductive organs of some big fish; nobody was sure exactly what kind. Everybody's looking at me to see what my reaction will be.

So here's honne: "bleaghh, this thing tastes weird, and the texture is even weirder. I'll be lucky if I can get it down."

And, in contrast, tatemae: "Thank you for offering this experience. It is a very interesting flavor!"

Note that both are, in fact, true. I'm going to get a lot of mileage out of this story, much more so than if we had just had nice steaks or what have you. But the Japanese make the distinction explicitly, and pretty much expect it in daily relations. In a way, that's actually more honest than the American way, which is to pretend that it's all honne all the time, but we do it too. (lots of other gaijin have written about this topic - this one is one of the better explanations. And, of course, for insight into how dysfunctional Japanese culture is from the perspective of an American teaching English in the schools, nothing beats Azrael's blog)

Gadgets

I spent some time walking through Akihabara and just letting the gadget-ness wash over me. In some ways, the technological progress is awesome, but in other ways I'm beginning to wonder if the engine may be slowing down.

On the plus side, digital cameras have finally really arrived. I picked up a Panasonic FX-9 (6MP compact) and am absolutely thrilled with it. Good pictures (I've linked one or two from this entry), cool funky features such as the ability to take movies, and even a rotation sensor. (neither iphoto nor yahoo photo knows how to interpret the tag yet, but I'm sure that will happen soon). It's got a 1GB flash card that looks to me just like a 3.5" floppy scaled down to an inch. I remember my first hard drive, it was 20MB and occupied a 5.25" form factor.
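
The rotation sensor just writes the standard EXIF Orientation tag (0x0112), so honoring it takes only a few lines. A sketch using the Python Imaging Library, assuming a JPEG straight off the camera:

    from PIL import Image

    ORIENTATION_TAG = 0x0112
    # EXIF orientation value -> degrees of counter-clockwise rotation needed
    ROTATIONS = {3: 180, 6: 270, 8: 90}

    def autorotate(path):
        img = Image.open(path)
        exif = img._getexif() or {}
        orientation = exif.get(ORIENTATION_TAG, 1)   # 1 = already upright
        if orientation in ROTATIONS:
            img = img.rotate(ROTATIONS[orientation], expand=True)
        return img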

But on the other hand, I sense the magic has gone out of it. Sure, the pace will advance. We'll be able to stick more and more songs up our ass. But a lot of the stuff, computers in particular, lacks much in the way of anything fresh and new. The majority of laptops here still have 1.2GHz Pentium M chips, although of course 2GHz is available on the high end. Displays are pretty much the same as a few years ago, just a bit brighter, higher contrast, and faster.

I was also looking for a pocket electronic Japanese/English dictionary, but didn't find a model that really appealed to me. They've got relatively pixelly monochrome LCD displays, cost around $300 for a good model with the kanji dictionary and so on, and none of them have features designed to make life easy for a Westerner trying to learn Japanese (an untapped market, perhaps?) It seems to me you'd be far better off with a Nokia 770 and some dictionary software (perhaps even wiktionary-based, which seems to be gaining momentum).

Windows Media Photo

Windows Media Photo is part of the upcoming XPS format from Microsoft. From what I understand, it basically has the advantages of JPEG2000, but without the problem of people other than Microsoft owning patents on it. We may be starting a crash project soon to implement it from scratch. It's too early to tell whether it's going to be kosher to do a true free software release, but we're in contact with the right folks at Microsoft and are pushing on them. Anyone you know have an interest in image codecs, a taste for implementing specs, and a need for some extra walking-around money?

Misc

Now that I have a digital camera, I'm missing more than ever the ability to insert inline images, so I'll want to add that.

I'm also missing the ability for people to write direct followup comments, so much so that I considered writing this as an article. Hopefully now that the trips have wound down, I'll have some time for that. Hmm, where have I heard that before?

iagorubio: sorry about that. My acm account got thoroughly deluged by spam, so I let it lapse. I've updated my contact information, so hopefully it will be easier for people to get in touch with me now.

Advogato spam

mathrick recently pointed out that some people are trying to abuse Advogato by posting spam, and suggested that I delete or disable the idiot's account.

Of course, I'll do that if it turns out to be a real problem, but in the meantime it's a pretty good test of Advogato's trust metrics, especially the newer one for comment rating. Because of the size-0 profile bug, it wasn't being recomputed in a timely fashion, but it's back online now. And, happily, the offending page has a rating of "1", so it's suppressed altogether from the recentlog view linked from the front page (assuming you're logged in).

There are some improvements I really want to make to the rating mechanism, and I may be able to get to them sooner rather than later. For one, I want the un-logged-in view to use some reasonable default (maybe mine :) for the trust metric seed, rather than showing the results unfiltered. I also want to do a dynamic trimming thing, so that lower-ranked blog entries get trimmed more aggressively than higher-ranked ones, as sketched below. Same thing for <img> tags, which are sadly missing from Advo blogs.
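
Roughly, what I have in mind for the recentlog is a threshold plus a rating-dependent trim. A quick Python sketch, with all the thresholds as placeholder guesses rather than settled values:

    SUPPRESS_AT = 1                            # diary ratings run 1..10
    TRIM_CHARS = {2: 500, 4: 1500, 7: 4000}    # rating ceiling -> max length

    def render_recentlog(entries):
        """entries is a list of (rating, html_text) pairs, newest first."""
        out = []
        for rating, text in entries:
            if rating <= SUPPRESS_AT:
                continue                       # suppressed, like the spam entry
            # find the trim limit for the lowest ceiling at or above the rating
            limit = next((n for ceiling, n in sorted(TRIM_CHARS.items())
                          if rating <= ceiling), None)
            if limit is not None and len(text) > limit:
                text = text[:limit] + "... (trimmed; click through for the rest)"
            out.append(text)
        return out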

The Advogato trust metrics really do work. I must really suck at self-promotion, because the rest of the world doesn't seem to notice or care. Ah well.

Resolution-independent UI

One of the features intended for MacOS 10.4 (Tiger) was resolution-independence in the UI. What actually shipped is pretty much a baby step in that direction, as opposed to something really usable.

Overall, it looks pretty much like what I proposed a year and a half ago. What I called the "pppx ratio" (pixels per px unit) is called the "scaling factor", and is accessible as the userSpaceScaleFactor method in Cocoa, or HIWindowGetScaleMode() in Carbon. Handling of legacy apps is also very much as I proposed. I wrote last April: "If the app is not LQ-aware, then as it asks for a display surface, the OS knows to give it one with 100 dpi "pixels", so that each pixel that the app draws shows up on the hardware as a 2x2 square of pixels". This is implemented in Tiger as Magnified Mode. A Carbon app signals its "LQ-awareness" by setting the kWindowFrameworkScaledAttribute flag at window creation time.

Even so, the results, as shipped on 10.4, aren't that good. Mortal users don't have any way of setting pppx, but developers can just open -a "Quartz Debug", then press apple-U to bring up a slider. I noticed that most apps shipping with the OS have "framework scaling" set, as opposed to just drawing in Magnified Mode, but many have artifacts. A few set the window height (but oddly enough, not the width) based on pixel, rather than px, units, so the window is chopped off at the bottom when the scaling mode is greater than 1. Quite a few (Safari especially) generally work, but have artifacts. A few seem to work well right out of the box (I didn't do a careful study, but Cocoa seemed to predominate here).

Text and vector graphics scale well, but bitmaps don't. And, indeed, all the Aqua UI elements are drawn as pppx=1 bitmaps, then scaled, so things don't look all that sharp. Similarly for web pages shown in Safari - all the bitmap graphics look really fuzzy in comparison to the text. I've said this before and will say it again: the Web desperately needs good standards for providing images at resolutions greater than 96 dpi. I say "desperately" because it needs them today to provide quality printing, but of course once the problem is solved it will work just as well for LQ displays as well.

One interesting note: when pppx=1, the display dpi is nominally 72 dpi, which was true of the original 9" Mac back in 1984. Yet, today, nearly all machines Apple ships have 100 dpi LCD screens. Setting pppx automatically by dividing the actual display resolution (queryable through a protocol spoken along the video cable) by 72 will typically result in a setting 33% too large. One solution is to make sure it always gets set manually (as I wrote a year ago, "The pppx ratio should be a global configuration parameter.") Another is to fudge the calculation. I guess we'll have to wait until 10.5 to see how Apple finally deals with all the usability issues.
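
For what it's worth, the fudge could be as simple as dividing by a reference closer to real panels and snapping to coarse steps. A Python sketch, where the 96 dpi reference and the quarter-step rounding are just my guesses at reasonable values:

    NOMINAL_DPI = 72.0     # the classic Mac assumption behind pppx=1
    REFERENCE_DPI = 96.0   # closer to the ~100 dpi panels actually shipping

    def auto_pppx(width_px, width_mm):
        """width_px and width_mm come from the display itself (EDID)."""
        dpi = width_px / (width_mm / 25.4)
        # dpi / NOMINAL_DPI would land roughly a third too high today
        fudged = dpi / REFERENCE_DPI
        return round(fudged * 4) / 4   # quarter steps keep UI bitmaps crisper

    # e.g. a 1280 px panel 331 mm wide is ~98 dpi: the naive calculation
    # gives pppx = 1.36, while the fudged one snaps to exactly 1.0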

Japan

I'm flying to Japan again from Nov 5 to Nov 16. I'll have the last few days more or less free, so that would be a good time to meet up with Advogatans in Japan. Let me know (through my gmail address) if you're going to be around then.

More thoughts on GUI tools

I got some good replies to my last post. Unfortunately, the somewhat archaic software hosting this site doesn't have any way of tracking replies and responses, so for the most part I have to work from memory. I think I need to give the author of the software a kick in the pants...

The answer to the question "what is the best approach to writing GUI apps?" depends a lot on a few basic assumptions. I didn't state those at all clearly in my last post, so I'll try to do that now.

The first big question is whether you're trying for something adequate and usable, or for a world-class app that is in no important ways inferior to professional apps designed natively for the platform. This question is especially important when trying to target multiple platforms from the same codebase.

There are lots of good approaches to "adequate" apps, including the venerable Tk (especially with the Tile improvements), wx, qt, or either of the two viable Java toolkits. But it is difficult or impossible to achieve the fit and polish of a real native app. Examples:

  • Seamless integration with the platform's input methods for non-English text.
  • Screen reading and other accessibility items.
  • On Mac, having the dock work as expected (drop items to it, badging).
  • Cut and paste with rich content.

It's not just about being able to draw widgets that look just like native ones; that's important, but it's only the tip of the iceberg.

If my goal were an "adequate" app, the generic cross-platform frameworks would be appealing. But that's mostly not what I'm thinking about at the moment.

Another very important axis is the choice of programming language for the app. If your language rocks at strings but sucks at everything else (like Tcl), then obviously having the GUI toolkit treat everything as strings is a good thing (hence Tk). But if your language has really good dynamic dispatch for objects (like Objective-C), then you clearly want your UI events to map directly to object methods, rather than having to filter through big switch statements on split strings. What you really want is Cocoa (or maybe GnuStep).

And for me, the operative assumption is that I'm writing primarily in C, but am also very interested in prototyping in Python. Unfortunately, C is not really good at dealing with either strings (you have to do handstands to get the memory management right) or objects. Other things, like exceptions, are also painful. So the constraints of trying for a reasonably good C API are likely to warp the design into something not very good in other languages.

So I don't hold out all that much hope for the convergence of GUI toolkits, at least any time soon. Different frameworks for different goals.

A funny, well-written review

Tor found this. It made me laugh hard enough not to be considered appropriate in the presence of royalty.

Still under water

Wow, it's a long time between entries. The main problem has been an incredibly busy travel schedule - almost immediately after returning from the Netherlands trip, I was on the east coast for a day for a potential customer visit, then tor arrives this evening, then next week is our staff meeting, then maybe there's a followup trip to Japan in early November (during which I hope to meet up with some Advogatans in Japan).

But life goes on anyway. I'm getting some pretty darn good looking color laser output from the Konica Minolta 5430 driver. I'll have all the goodies checked into CVS soon, and hopefully there will be some people in free software land who will want to play with it.

Also, you might notice that the certifications and diary ratings are working again. With some help from pphaneuf, I tracked the problem down to some zero-length profile.xml files in the database, which appeared when casper's disk became full (a consequence of having spam^H^H^H^Hmail delivered to the same partition as Advo's db - a bad idea); some of the virgule code then segfaulted on the NULL pointers that came back. I'm hoping to get some time to add some new features, maybe by integrating StevenRainwater's code.

I'm also continuing to hack a bit on Ghilbert, and have also been participating in some of the discussion on AsteroidMeta. Ghilbert is still very much a stealth project, but I'm feeling closer to opening up browsing and posting on the web site. At some point, I'll need to decide between adapting existing wiki/web board software or crafting my own.

Cross-platform UI's

Another long-time fascination of mine has been tools for building cross-platform UI software. I've never been thrilled with any of the alternatives available, although of course wx comes close.

My main frustration stems from the difficulty of making really good cross-platform apps, which have a truly native look-and-feel on the three most important platforms: OSX, Windows, and X. I'm convinced that the Java philosophy of "write once, run anywhere" is wrong for desktop GUI apps. You want access to the goodies that the platform provides, and you need behavior to be different on different platforms. Writing for a lowest-common-denominator abstraction layer gets in the way. There's a tremendous amount of good knowledge encoded in the Apple Human Interface Guidelines, and I think it's far more important to design to those guidelines when running on the Mac platform than to be able to run the exact same source code on all your targets.

I've been experimenting with a somewhat different approach. You have a very thin layer on top of the native GUI toolkit, which gives a common interface to all the vanilla stuff like creating menus and buttons and so on, but ultimately you're building three different apps for the three platforms, with hopefully around 90% of the code shared. Much less than that and the duplication is onerous, and much more than that you're putting too much pressure on the abstraction layer without much benefit.

One reason I think such an approach can work is that the design principles behind GUI toolkits are converging, not to mention the underlying system support. In the bad old days, a native Mac Classic, Xt, and Win 3.1 app would be very different beasts indeed. These days, you can rely on Unix-style processes and memory management. More to the point, I find Carbon HIViews and Gtk+ widgets to be strikingly similar. In both cases, you have a mainloop which uses callbacks to send events into the application, you have widgets implemented as objects-in-C, and in general what's different is details.

And I've got some specific ideas for smoothing over those differences in detail. For one, I think Tk has the right idea of using strings to communicate between the app and the UI toolkit, especially for callbacks into the app. It's really easy to deal with strings across platforms, languages, and so on, while the toolkits tend to have large, nasty, and opaque interfaces.
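
To give a flavor of it, the thin layer amounts to little more than a registry mapping command strings to app callbacks, with each native backend translating its events into those strings. A toy Python sketch - the real layer is C, and every name here is made up:

    class UILayer:
        def __init__(self):
            self.commands = {}

        def register(self, command, callback):
            self.commands[command] = callback

        def dispatch(self, event_string):
            """Backends deliver events as strings like "menu save";
            split once, route on the first word."""
            name, _, arg = event_string.partition(" ")
            handler = self.commands.get(name)
            if handler is None:
                print("unhandled event:", event_string)   # cheap debugging
                return
            handler(arg)

    # app side: this registration code runs unchanged on every platform
    ui = UILayer()
    ui.register("menu", lambda item: print("menu pick:", item))
    ui.register("button", lambda which: print("button:", which))

    # backend side: a Gtk+ signal handler or Carbon event callback calls
    ui.dispatch("menu save")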

So far, my prototype seems to work nicely and the code is fun. Is there much interest in such a thing, or am I barking up a tree?

