Older blog entries for Raphael (starting at number 26)

David O'Toole writes:

[...] Looking at stuff like this makes me get just a tiny bit upset about how badly the linux world is dragging its political feet with respect to improving the interface. I'm not talking about making all the OK buttons respond to the Enter key (currently my biggest pet peeve about GNOME, and it's slowly being fixed---recent GIMP etc.)

I'm talking about the imaging model. I don't want to criticize X unfairly. The X Window System was brilliant for its time and in its environment. But it simply does not support what people want to do now well enough to continue. Fast vector imaging, transparency, high-resolution monitors, antialiasing. Yes, you can implement software on top but there's no standard and it's slow.

The first defense I hear all the time is network transparency. I respond: who cares.
[...]

Well... I, for one, care very much about the network transparency of X. I am currently typing this from a Solaris machine on which I have other windows displayed remotely from a Linux machine and other Solaris machines. Not only some XTerms and Emacs that could also work over telnet/rsh/ssh, but also graphical applications like Purify, Quantify, Netscape, XMMS and some other goodies. They are all on the same LAN so speed is not really an issue. Without X's ability to display anything anywhere, writing and debugging my programs would be much harder.

So maybe I am among the 1% of people who really use remote displays and would not be satisfied with text-based remote logins. This does not mean that nothing should be done for the other 99% who would like much better performance from the applications that are running on the local display.

I don't think that it is necessary to throw X away and to start again from scratch. The DGA extension (available on OpenWindows and XFree86) proves that you can get decent performance out of X, although this requires some specific code that is rather ugly and not easy to write and maintain. Most programmers do not want to write some additional code for specific X extensions, and indeed they should not be required to do so.

But it would be possible to get better performance while keeping the X API. Imagine that someone modifies the shared X library (libX11.so) so that if the client connects to the local server, all X calls that are normally sent to the X server over a socket are instead translated into optimized drawing operations accessing the video buffer directly. The shared X library would more or less contain some bits of the server code (actually, a stub could dlopen the correct code). If the X client connects to a remote server, the X function calls would fall back to the standard X protocol. All clients that are dynamically linked to that modified library would automatically benefit from these improvements without requiring any changes to their code. So it can be done without throwing away the benefits of X. Actually, I believe that some people are working on that at the moment...
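The first decision such a modified library would have to make is whether the DISPLAY it is connecting to is local. Here is a minimal sketch of that check; the function name is invented for illustration, and a real implementation would also have to compare an explicit hostname against the local host rather than just looking at the string prefix:

```c
#include <string.h>

/* Hypothetical helper: decide whether a DISPLAY name refers to the
 * local server, so the modified libX11 could bypass the X protocol
 * and draw directly.  Only the common ":0"-style and "unix:"-style
 * local names are recognized in this sketch. */
static int display_is_local(const char *display)
{
    if (display == NULL)
        return 0;
    if (display[0] == ':')                    /* e.g. ":0.0" */
        return 1;
    if (strncmp(display, "unix:", 5) == 0)    /* e.g. "unix:0" */
        return 1;
    return 0;                                 /* e.g. "remotehost:0.0" */
}
```

The stub would call something like this once at connection time and then dlopen either the direct-rendering code or the normal protocol code.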

Question: maximum information density in the print-scan process?

Does anybody know how much information can be stored and reliably retrieved from a piece of paper, using a standard printer (inkjet or laser, 300dpi) and a scanner (1200 dpi)? Since a piece of paper can be affected by bit rot (literally) and can be damaged in various ways, some error correction (e.g. Reed Solomon) and detection (e.g. CRC) is necessary. Also, I do not want to rely on high-quality paper so I have to accept some ink diffusion and "background noise" introduced by defects in the paper.

I found some references to 2D barcodes (such as DataMatrix, PDF-417 and others) but these codes are designed to be scanned efficiently by relatively cheap and fast CCD scanners. I am not worried about the scanning time (I am using a flatbed scanner) or the processing time (I can accept some heavy image processing). Also, I would like to encode raw bits and pack as much information as possible on a sheet of paper, regardless of its size. These 2D barcodes have a fixed or maximum symbol size and it is necessary to use several of them if I want to fill a sheet of paper, wasting space in the duplicated calibration areas and guard areas.

PDF-417 has a maximum density of 106 bytes per square centimeter (686 bytes per square inch, for you retrogrades), which is quite low. It is certainly possible to do better, but I would like to know if there are any standards for doing that. I am especially interested in methods that are in the public domain, because most 2D barcodes are patented (e.g. PDF-417 is covered by US patent 5,243,655 and DataMatrix is covered by 4,939,354, 5,053,609 and 5,124,536).
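For comparison, here is a back-of-the-envelope calculation of the kind of density one could hope for. All the numbers are assumptions, not measured limits: I assume that one data cell must span a block of printer dots (say 3x3 at 300 dpi) to survive ink diffusion and scanning, and that error correction eats a fixed fraction of the cells (e.g. a rate-3/4 Reed-Solomon code):

```c
/* Illustrative payload density for a hypothetical print-scan code.
 * print_dpi  printer resolution in dots per inch
 * cell       side of one data cell in printer dots (a 3x3 cell at
 *            300 dpi is assumed to be one safely scannable "pixel")
 * code_rate  fraction of bits left for payload after error
 *            correction (e.g. 0.75 for a rate-3/4 Reed-Solomon code)
 * Returns payload bytes per square inch, one bit per cell. */
static double payload_bytes_per_sq_inch(int print_dpi, int cell,
                                        double code_rate)
{
    double cells_per_inch = (double)print_dpi / cell;
    double bits_per_sq_inch = cells_per_inch * cells_per_inch;
    return bits_per_sq_inch * code_rate / 8.0;
}
```

With these assumptions, 300 dpi, 3x3 cells and a 3/4 code rate give 937.5 bytes per square inch, already above PDF-417's 686, which suggests that the 2D barcode formats are indeed far from the achievable maximum.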

If you know any good references, please post them in a diary entry (I try to check the recent diaries once a day, but I may miss some of them) or send them to me by e-mail: quinet (at) gamers (dot) org. Thanks!

Hmmm... This is a bit long for a diary entry. But I don't think that such a question deserves an article on the front page. If you think that I should have posted this as an article, then send me an e-mail and I will re-post this question and edit it out of my diary.

I posted my opinion on using GdkRgb in Ghostscript, in the LinuxToday article about Raph's open letter to the Ghostscript community. IMHO, GdkRgb is the best solution, and those who see it as an attempt to force them to use "Gnome stuff" on their desktop do not understand the way Ghostscript works or what GdkRgb is.

This is not new, but it looks like anything that mentions Gnome is flamed by KDE bigots, and vice-versa (yes, it does happen both ways). The interesting thing here is that the most vocal critics are not developers and/or show clearly that they do not understand what they are talking about. Sure, they want someone (who?) to fork Ghostscript, presumably to create a highly productive KDE branch or something like that. What a bright idea! Sure, they could get rid of any Bonobo linking, but throwing GdkRgb away would be stupid.

Sigh! Even if you are careful about what you communicate (I think that Raph's letter was nice and explained very well that using GdkRgb would have no influence on KDE), some morons will find a way to interpret it in a different way.

22 Sep 2000 (updated 22 Sep 2000 at 16:58 UTC) »

I'm going to Bristol (UK) for the HUC2k symposium. I suppose that the probability of meeting someone reading Advogato at this conference is close to zero, but I will be there anyway. And I will stay in the Posthouse hotel from Sunday evening to Wednesday, so if you are reading this (maybe), and you met me at GUADEC or something (unlikely) and you will be in Bristol for the conference (extremely unlikely), then feel free to come and say hello.

Ghostscript

It is nice to see that Ghostscript has a new maintainer, in the person of raph. Congratulations and good luck! Ghostscript is already very good, and adding better antialiasing and other stuff from Libart will make it even better.

Hmm... There seems to be an account for L. Peter Deutsch on Advogato. Not very active, apparently...

Diaries, yet another meta-discussion...

At the end of a previous diary entry, raph mentioned that the diary format is working, but is not ideal for question-and-answer discussions. Well... Obviously the diaries were not designed for that, but it is great to see how they have evolved. There seems to be a need (among the free software community) for this kind of discussion, which is more public than direct e-mail, mailing lists or IRC, but without being restricted to a particular topic like the articles on the front page.

A first step would be to use automatic bi-directional links whenever possible. Whenever someone posts a diary entry containing a link to someone else's diary, the filter that parses the submission would at the same time add a backwards link at the end of the other diary (e.g. "[1 comment by so-and-so]"). It would then be easier to check if someone has replied to your diary entry.
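The parsing step is simple enough to sketch. Assuming (and this is an assumption about the site's markup, not a description of the actual mod_virgule code) that diary links look like `<a href="/person/NAME/">`, the submission filter would only need something like this to find the account to annotate:

```c
#include <stddef.h>
#include <string.h>

/* Sketch of the backward-link idea: scan a submitted diary entry for
 * the first link of the assumed form href="/person/NAME/" and copy
 * NAME into the caller's buffer.  The caller could then append a
 * "[1 comment by so-and-so]" note to NAME's diary.
 * Returns 1 if a name was extracted, 0 otherwise. */
static int extract_person_link(const char *html, char *name, size_t len)
{
    const char *prefix = "href=\"/person/";
    const char *start = strstr(html, prefix);
    const char *end;
    size_t n;

    if (start == NULL)
        return 0;
    start += strlen(prefix);
    end = strchr(start, '/');
    if (end == NULL || (n = (size_t)(end - start)) >= len)
        return 0;
    memcpy(name, start, n);
    name[n] = '\0';
    return 1;
}
```

A real filter would of course loop over all such links and skip self-references, but the core is just this string scan.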

But as the number of diaries grows, it becomes increasingly difficult to keep up with the postings. It will not take long before the daily submissions cannot fit on the front page. Already now, it is easy to miss some parts of a discussion if you go away for a couple of days. And the only way to read the missing parts is to look at the pages of all potential participants and check their previous entries. This is not very convenient, because you may forget some of them and you may not know that a new guy has posted some interesting comments. Of course, that could be solved by another hack to Advogato: allow the "recentlog" to take a range of dates, or at least a starting date. It would then display all diaries that have been posted or modified during that time, so that you could read last week's diaries in chronological order if you missed them. (Implementation note: Advogato should store a chronological index of all diaries, otherwise finding and sorting them would be inefficient.)
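The chronological index mentioned in the implementation note could be as simple as a sorted array of (timestamp, entry) pairs; the field names below are invented for illustration, and this is a sketch of the idea rather than actual mod_virgule code:

```c
#include <stdlib.h>

/* Hypothetical chronological index entry: when a diary was posted
 * (or last modified) and which entry it refers to. */
struct diary_ref {
    long posted;      /* seconds since the epoch */
    int  entry_id;
};

/* qsort comparator: oldest first. */
static int by_time(const void *a, const void *b)
{
    long ta = ((const struct diary_ref *)a)->posted;
    long tb = ((const struct diary_ref *)b)->posted;
    return (ta > tb) - (ta < tb);
}

/* Count the entries with from <= posted < to.  A linear scan is fine
 * for a sketch; on a sorted index the matching entries form one
 * contiguous run, so a binary search would find the range quickly. */
static int count_in_range(const struct diary_ref *idx, int n,
                          long from, long to)
{
    int i, count = 0;

    for (i = 0; i < n; i++)
        if (idx[i].posted >= from && idx[i].posted < to)
            count++;
    return count;
}
```

With such an index, a "recentlog" for last week is just the slice of the array between two timestamps, in chronological order.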

But where does that lead? If it is easier to discuss things in the diaries, that part of Advogato would become similar to a web-based bulletin board or chat room. Or a web-based version of USENET. The comparison with USENET and other chat rooms is interesting: they allow threading (using a "References" header in the newsgroups, or direct links in the web fora) and they provide easy ways to separate unrelated topics (different subject lines, newsgroups or chat rooms). The Advogato diaries put everything in one large page and it is up to the readers to separate the interesting things from the noise. But on the other hand, this can be considered a feature that reinforces the community, because all members get the opportunity to read some articles that they might have skipped if the topics had been clearly separated. Also, another feature of the diaries is that they do not have subject lines: those who want to add them can do it (using bold and/or indentation) but nobody is forced to structure their diaries in any way. It is difficult to please everybody...

So I don't know what would be best for Advogato (anyway, who am I to judge?) but I think that there are several significant differences between the diaries and a full-featured discussion forum, and these differences may be good for Advogato. If nobody has enough spare time to add a discussion forum besides (and not as a replacement for) the diaries, then I am happy with the current situation. Hmm... Maybe it would be better with the addition of bi-directional links...

I wish I had the time to reply to raph's diary entries about the Tragedy of the Commons. There are lots of interesting ideas and things to discuss.

I need a break from work. I am going on vacation tomorrow.

There is yet another discussion of the QPL vs. GPL issues, on Slashdot this time. I posted a reply to some comments related to the two Freshmeat editorials. I wonder when these license incompatibility problems will be solved.

<rant>And I am always surprised by the number of people who fail to understand some basic things about licenses and copyrights. IANAL and I am probably wrong in some of my comments, but at least I try to get the basic facts straight and to see the difference between law and opinions. It looks like most contributors to these stories don't do that. Ah well... </rant>

I saw a new editorial on Freshmeat discussing the incompatibility between the GPL and the QPL version 1.0. It contains a good step-by-step explanation of the problems. I included a link to it as well as some comments about the editorial and its replies at the bottom of my previous article about the QPL.

I read Cees de Groot's diary in which he mentions that the Orbiten Free Software Survey credits him for a mere 39 K of code. Since I am curious, I decided to check what the survey says about me and I saw that it credits me (author 3745) for the incredible total of 28 bytes of code. Wow! I'm impressed...

For some reason, it did not find me among the contributors to the GIMP or some other packages to which I contributed. But after a bit more careful investigation, I found that the database had registered me several times, so I have the other profiles 7841 (credited for 21 K) and 9709 (credited for 55 K). Still, that's not much... The interesting thing is that I am credited for some projects in which I did not even know that my code was used. But on the other hand, I could not find my name in any of the projects to which I contributed directly (except for rplay). Funny... Anyway, I'm curious to see the results of their next study, which should be available soon.

While I am thinking about contributions to free software projects, I realized that I wrote some patches to the GIMP last month and I forgot to submit them. They have been sitting on my hard disk since then. Hmmm... Maybe I need more sleep?

Over the last few days, my SO has told me a couple of times that I am strange (why am I not surprised?) She has seen me picking up some small flowers, dead bugs, stones and other weird stuff from various places, then running back home and putting them on my scanner. I told her that these would make great textures and brushes for the GIMP, but she still thinks that I am a bit strange. She was disgusted when she saw me tearing off the wings of a dead dragonfly in order to scan them (I got the idea from here). But she liked the final result very much: dragonfly wings are beautiful and can be used as a GIMP brush to create very nice effects. Now I still have to find a dead locust somewhere...

Besides this and some more hacks and bug-fixes on the GIMP code and Script-Fu scripts, I also spent some time trying to convert scans or photos into tileable textures. There are many ways to do that, depending on which features of the original image should be preserved (the "make seamless" plug-in does not give good results most of the time). Maybe I should write a tutorial about that, because the tricks that should be used to make real objects tileable are different from the ones that are used to create textures from scratch.

Yesterday, I spent some time scanning random stuff that I could find in my house. I would like to create some nice brushes and textures for the next version of the GIMP (1.2). So I scanned some samples of wood, pebbles, skin, flowers and even bread and pasta. Since I scanned them at a high resolution (1200 dpi), I now have a number of large files (several megabytes each) that I have to shrink down to a few dozen KB if I want them to be distributed with the gimp-data-extras package. I hope that I will be able to produce some nice, high-quality textures.

I also hacked some Script-Fu logo scripts so that they can register in a new menu "Alpha to logo". I already sent a patch for the first five scripts on Friday, and I continued yesterday. Still a few more to go before all of them are converted. It is very nice to have these enhanced scripts when you have the GIMP Freetype plug-in installed.

I am not sure that I will ever publish the second part of my article about Moderation and Certification. I decided to wait for one month before publishing anything new because the first part of that article was published at a time when a lot of "meta" discussions were posted (sometimes as replies to unrelated articles) and some people were rightfully upset about the number of meta comments. Now the month is over and I still do not think that it would be appropriate to post the second part of my article. The best way to contribute would be for me to write the code instead of just talking about it, but on the other hand it is nice to have a discussion before implementing something that people may not like. Hmmm... I think that I will wait a bit more before deciding what to do with that article.

