Why Sun is right that Java sucks

Posted 10 Feb 2003 at 07:57 UTC by exa

This internal memo, linked from Slashdot, caught my attention. It details why Java should not be used in Sun's own production software. The summary is "Java implementation sucks", and here is some input from me.

Back in '97 I used Java to write a small 3D graphics library, and I found its performance to be quite poor. Years later, in 2001, since there was a new version of Java, I wanted to give it another shot.

Here is the paragraph from the Sun memo that really gave me the chills: Further examples of what is possible include the compiling OO languages Eiffel and Sather which fit their garbage collector, exception processor and other infrastructure into roughly 400K of resident set. While the Java VM (as demonstrated above) grows rapidly as more complex code is executed, the Python VM grows quite slowly. Indeed, an inventory control program written entirely in Python having a SQL database, a curses UI, and network connectivity requires only 1.7M of resident set. This seems to indicate that the resident set requirements of the JRE could be reduced by at least 80%.

Aha! Don't I know that VM problem. I implemented a graph-based clustering algorithm and even integrated it into Weka (a machine learning framework written in Java). Since the language also sucked, it wasn't easy to write (you have to cast everywhere when you implement a generic data structure). Anyway, it wasn't that different from writing in C++. The crux of the matter was the awful performance. The training data was only a few thousand elements, processed and fed into the clustering algorithm. It performed awfully badly, consuming about 100M of memory! It took several minutes to run, when the same quite efficient algorithm, written in C++, would probably complete in a matter of seconds. What it does is build a similarity graph and then do agglomerative clustering in a multi-level fashion, like Karypis's code, with some new heuristics that I was trying. So it should have a complexity close to O(E) in practice; it shouldn't take minutes... Whichever way you view the problem, it was unacceptable. The whole Java VM concept was so flawed that you couldn't run a single not-so-expensive algorithm with acceptable performance on this system.
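
To illustrate the casting complaint: before generics, Java collections could only hold Object, so every read needs an explicit downcast that is checked only at runtime. A minimal sketch (the class and variable names are invented; this is not the actual Weka code):

  import java.util.ArrayList;
  import java.util.List;

  // Pre-generics Java: a List stores plain Objects, so element types
  // are enforced only at runtime, through explicit casts.
  public class CastExample {
      public static void main(String[] args) {
          List vertices = new ArrayList();   // no type parameter available
          vertices.add(new Integer(42));     // no autoboxing yet either

          // Every read needs a cast; a wrong cast still compiles and
          // only fails with a ClassCastException at runtime.
          Integer v = (Integer) vertices.get(0);
          System.out.println(v.intValue());
      }
  }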

There are even occasional idiots on the Beowulf mailing list, on other parallel programming forums, and sometimes at academic conferences, who claim to be using Java for HIGH PERFORMANCE COMPUTING. That is the worst joke of the century. Java was not meant for the high-performance domain, where you take things like cache coherency into account while implementing your algorithm. You have to think twice before you say "high performance" about a runtime that is an absolute memory hog.

Well, of course the Java advocates are not aware of this problem, because much of what they do is slow, stupid GUIs and web annoyances. To them the performance seems acceptable.

Not to me.

I said back in '97 that Java sucked, and as proof I pointed to the lack of desktop applications on ANY system. Now it's 2003, I am saying that Java sucks, and I'm pointing to the lack of desktop applications on ANY system to prove it. I look at my Debian box: no significant Java apps, except maybe Freenet, which isn't a desktop app.

No surprise, even Sun engineers know that Java can't be used for any serious application.

__

exa


Java has a GC problem, posted 10 Feb 2003 at 16:55 UTC by tjansen » (Journeyer)

IMHO Java suffers from a garbage collection problem. If you don't allocate objects and maybe use only static methods, Java can be quite fast. But when you start creating huge amounts of objects (as required when working with Java's String class), its memory use and performance get worse and worse.
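
A rough illustration of that kind of allocation churn (the class name is invented and the counts are only indicative): naive String concatenation creates a fresh String, plus a temporary buffer, on every iteration, while an explicit StringBuffer reuses one growing buffer.

  public class StringChurn {
      public static void main(String[] args) {
          int n = 10000;

          // Naive version: each += allocates a new String (and a hidden
          // temporary buffer), so the loop produces on the order of n
          // short-lived objects that immediately become garbage.
          String s = "";
          for (int i = 0; i < n; i++) {
              s += i + ",";
          }

          // Reusing a single StringBuffer avoids most of that garbage.
          StringBuffer buf = new StringBuffer();
          for (int i = 0; i < n; i++) {
              buf.append(i).append(',');
          }
          String t = buf.toString();

          System.out.println(s.length() + " " + t.length());
      }
  }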

In theory GCs should be at least as fast as manual memory management or reference counting (which Python uses). Instead of wasting time on memory management while the program is working, a GC defers it until the program is idle or runs out of memory. Unfortunately, on today's systems memory is extremely slow and CPU cycles are cheap, and this is why the GC theory does not work. The Java VM constantly thrashes the cache because it does not re-use memory fast enough. Instead it takes new (usually uncached) memory for new objects and defers freeing the unused memory of old objects (which is still in the cache). This is probably the worst thing that you can do to the cache. A good VM would try to re-use memory as soon as possible, to increase the chances that it is still in the cache (like Python's refcounter). Java does the opposite.

To make things worse, the VM seems to lack any coordination with the kernel. When the system is running out of RAM and needs to swap, the logical action for the VM would be to start the garbage collector. It doesn't, however; instead it keeps allocating new memory, forcing the kernel to move the old (unused) memory into swap space! And when the VM finally decides to start the GC, it goes through all the unused memory that is now in swap, causing it to be reloaded and possibly pushing more frequently used memory out to swap, only to re-load that again later. How much worse can it get?

Java is fine for the majority of its uses, posted 10 Feb 2003 at 17:50 UTC by AlanShutko » (Journeyer)

It's important to look at how Java is really being used. Mostly, it's for web applications, which consist of a fairly thin layer of glue between the web server and some data stores. In this setup, the problems of Java don't show up too much. There's only one VM, and it runs all the time, so startup time isn't a big deal. You generally do very little per request, and there's so much communication overhead that faster processing wouldn't help you much.

Most of this programming doesn't use anything fancy enough to be called an algorithm. There are exceptions, of course, but they're rare enough that you can devote extra attention to them, handled by someone who knows the Java language and the way it's implemented well enough to know the performance pitfalls.

I'd guess that you could have avoided most of the performance problems in your clustering algorithm by carefully avoiding copies and memory allocation. Java makes it easy to ignore these costs, but it doesn't make them go away.
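
A hedged sketch of what that usually means in practice (these class names are invented and have nothing to do with the original Weka code): keep edge weights in preallocated primitive arrays instead of allocating one small object per pair.

  // Allocation-heavy style: one small object per edge, shown only for contrast.
  class Edge {
      int from, to;
      double weight;
      Edge(int from, int to, double weight) {
          this.from = from;
          this.to = to;
          this.weight = weight;
      }
  }

  // Allocation-light style: parallel primitive arrays, allocated once
  // up front and reused across passes of the clustering loop.
  public class EdgeTable {
      private final int[] from;
      private final int[] to;
      private final double[] weight;
      private int size;

      public EdgeTable(int capacity) {
          from = new int[capacity];
          to = new int[capacity];
          weight = new double[capacity];
      }

      public void add(int f, int t, double w) {
          from[size] = f;
          to[size] = t;
          weight[size] = w;
          size++;
      }

      public static void main(String[] args) {
          EdgeTable edges = new EdgeTable(3);
          edges.add(0, 1, 0.9);
          edges.add(1, 2, 0.4);
          System.out.println(edges.size + " edges stored without per-edge objects");
      }
  }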

I'm not trying to be a Java apologist here, because your example does show that the hype of Java isn't all true. Writing Java doesn't free you from worrying about memory allocation (or other things they claim are harder in C or C++) in all cases. It may be better than C/C++ in some ways, but a balanced view of it is necessary. And if you want a language that makes things substantially better, look somewhere else.

Re: Freenet, posted 10 Feb 2003 at 17:53 UTC by gary » (Master)

I firmly believe that the choice of Java has consigned Freenet to obscurity, which saddens me greatly.

GC not inherently bad, posted 10 Feb 2003 at 17:55 UTC by AlanShutko » (Journeyer)

I've heard that Java's GC just isn't very good, compared to how well a GC could work. For instance, here's JWZ's assessment. But a quick check couldn't find any recent analyses of Java's GC vs. the state of the art. I know that it's using a generational GC these days, but I don't know how good it is. (I do know that objects can get old enough that they're collected once a day, which was annoying to us on one project....)

Does anyone know more?

The sad thing is... Java is what's being pushed hard, posted 10 Feb 2003 at 18:15 UTC by davidw » (Master)

Look at job boards like Dice, and the numbers are quite depressing:

Java: 2842
C++: 2206
Perl: 839
C#: 322
Tcl: 86
Python: 71
Lisp: 12
Ruby: 5
Management, in many cases, wants Java.

I don't think it's the wrong tool for the job in all cases - it's not bad for some things, but I don't think it's the right tool in a lot of other situations.

There is no lack of Java desktop apps, posted 10 Feb 2003 at 18:28 UTC by robilad » (Master)

It's funny to read someone complaining about the lack of Java desktop apps just days after the announcement of a Java branch at the open source directory: http://www.advogato.org/article/622.html

I contribute to Kaffe OpenVM. Guess what the most common issue is that makes a lot of Mandrake Linux users uninstall Kaffe and install Sun's JDK? They want to run LimeWire, a desktop application for file sharing written in Java. That's what a simple Google News search for kaffe and mandrake will tell you. And it's just one of several file-sharing desktop applications written in Java. I'd assume that a lot more Joe Average users run file-sharing apps written in Java than graph-based clustering algorithms written in C++ ;)

Fancy a mail client? Check out Columba, or ICEMail.

Do you want a text editor with that? Check out Jext. Or just get a full IDE, while you're at it: NetBeans, Forte, JBuilder and Eclipse are all written in Java and run on a, you guessed it, desktop.

I don't feel like spamming Advogato with links to many other Java desktop apps, from an X11 server to an Ogg player, from an office suite to an XML editor. You can get some of them packaged as RPMs at JPackage. You can find others using your favorite search engine. Many Java applications are listed on JARS. Just because Debian doesn't have something packaged doesn't mean it doesn't exist ;)

If you want to argue memory usage, you should note that Sun's implementation is not the only one out there. For example, in the tests on http://www-106.ibm.com/developerworks/library/j-native.html , kaffe 1.0.6 uses up to 20 times less memory than Sun's JDK.

GC-clue, posted 10 Feb 2003 at 18:50 UTC by johs » (Apprentice)

And to go with JWZ's assessment, have a look at Uniprocessor Garbage Collection Techniques for some clues. (Yes, that was directed at you, tjansen.) Great reading, anyway.

Re: GC-clue, posted 10 Feb 2003 at 19:04 UTC by tjansen » (Journeyer)

I have yet to see how to make GC work efficiently without combining it with reference counts (your link is currently broken, johs). And I am certainly not the only one with that opinion; see for example Linus Torvalds' GC comment on the gcc mailing list.

What about gcj? what about C# (Portable.net) and Pike?, posted 10 Feb 2003 at 21:46 UTC by atai » (Journeyer)

For these "high performance" applications in question, have people compiled them with gcj (GNU Java Compiler) and run as native executables? With a traditional compiler, do the GC-related and other performance problems still exist?

How do C# (DotGNU Portable.NET) and Pike, two other Java-like languages, fare for these high performance applications?

Shared libs, posted 10 Feb 2003 at 23:56 UTC by djm » (Master)

It seems like Java throws away all the experience of implementing modern operating systems. Class libraries don't seem to be shared, executables are (very) heavyweight, libraries aren't versioned.

GC probably not the problem here, posted 11 Feb 2003 at 00:12 UTC by dan » (Master)

Well, let's see. The Java language specification requires that an implementation include GC. That means that Kaffe, and the Sun NT implementation, and all those other Java implementations that apparently don't suck by comparison to the Sun Solaris implementation under discussion, must also contain a GC. Not to mention Eiffel and Sather, also cited in the original memo, which also have GC, and also don't suck.

You are of course welcome to your humble(sic) opinion that the reason Solaris Java sucks is an inherent property of garbage collection, but on the evidence available I have to suspect there may be other factors involved.

Incidentally, the link to Paul Wilson's paper works fine for me, but note that it's ftp and the site may have been full when you tried. Try it via http instead: http://www.cs.utexas.edu/ftp/pub/garbage/. It's a very well-known paper.

Side note: as you say, in theory GCs could be at least as fast as manual memory management or reference counting (and sometimes, faster: consider the amount of pointer chasing involved in freeing a big collection of objects which all reference each other). Even in practice you can get an awful lot closer than Solaris Java apparently does. For example, use a generational GC, and size the nursery generation to fit inside the cache.

In the end, no general automated scheme will outperform the theoretical best job you can do with application-specific knowledge and a hand-crafted allocation regime, but, you know, in general no high-level language will outperform hand-crafted assembler either, and I don't see many people still saying "C sucks! Stick to assembler and stop wasting cycles". It's all about bottlenecks, and it really doesn't sound to me like a problem with the conceptual nature of GC is the bottleneck here.

Java has library versioning, posted 11 Feb 2003 at 00:33 UTC by robilad » (Master)

It's specified here: Extension Versioning

bigsurv.ps and JWZ, posted 11 Feb 2003 at 01:17 UTC by tjansen » (Journeyer)

OK, I looked over bigsurv.ps, and as far as I can see it does not even address my concerns. This is not really surprising, since the paper is from '94 and most of the quoted papers are even older. Back then efficient caching was hardly an issue, since CPUs were not much faster than memory. If I missed something about that topic, please point me to the section.

What I wonder is: how can a GC ever be faster than manual memory management or reference counting, when it re-uses memory later (and is thus less cache-efficient)? That is the question no one has answered so far. It doesn't matter if you need to chase pointers; that is almost completely irrelevant compared to cache efficiency for short-lived objects. You must be doing *really bad* manual memory management if you release your memory as late as a GC collects it. Reference counting should provide almost perfect results (unless you have the famous cyclic references, but that should not affect more than 10% of all objects in a language like Java).

BTW, concerning JWZ: all he says is that Java as a language does a bad job of helping the GC and does not let the application developer provide hints. Definitely true; if the byte code delivered better hints about life-spans (or offered a stack-based allocation equivalent, like C# does IIRC), it would certainly have fewer problems.

Java, an understanding, posted 11 Feb 2003 at 03:36 UTC by SyOpReigm » (Journeyer)

I know many smart people who seem to love Java, and I think I know why: Java is easy to use. The one problem with Java, and why it will never become the major mainstream language that some people want it to be, is that it is a rubber safety knife. Hear me out. Java prevents you from doing many things that C++ will happily let you do (read: pointer arithmetic). With this comes a lot of overhead. And let's not forget that it is, in most cases, an interpreted language; given two identical programs, one compiled and one interpreted, the compiled one will always be faster. While in a few cases Java may have its place, it is more of a novelty than anything else, and in most cases it can be replaced with something that would do the job better: Flash and PHP for the web, C++ for apps. Even C++ is portable if written correctly. Given a few years, and a little luck, every OS will be POSIX compliant, and then the major selling point of Java, write once run anywhere (AKA "write once, run NOWHERE", as we called it back in '97), will become meaningless.

GC, posted 11 Feb 2003 at 04:39 UTC by dan » (Master)

OK, I looked over bigsurv.ps, and as far as I can see it does not even address my concerns. This is not really surprising, since the paper is from '94 and most of the quoted papers are even older. Back then efficient caching was hardly an issue, since CPUs were not much faster than memory.

No, of course they weren't. Heck, back in those ancient days we'd barely even got used to using transistors instead of valves.

Sheesh, five minutes with CiteSeer is adequate to demonstrate that GC researchers were studying locality and cache issues even back in the stone knives, bearskins and jumpers-for-goalposts days of 1991. Here are two clues: Zorn and Wilson. It's not the case, pleasing though it may be to imagine, that everybody was living in blissful ignorance of the issues and nobody even realised their computers had caches until Linus Torvalds pointed it out on a GCC mailing list.

For what it's worth, there's an article on JVM GC tuning for 1.3.1 at http://java.sun.com/docs/hotspot/gc/. We can infer from it that the GC is generational, that minor collections copy, that the default sizes are probably pretty screwy for most purposes, and that the GC still stops the world whenever it kicks in. These are all techniques that were known in 1994.
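
For anyone who wants to watch this behaviour directly, here is a hedged sketch: a small allocation loop (class name invented, numbers purely illustrative) that you can run under the HotSpot VM with -verbose:gc, and optionally with young-generation sizing flags such as -XX:NewSize, to see minor collections and promotions happen.

  // Run with something like:
  //   java -verbose:gc -Xmx64m AllocationChurn
  // and optionally -XX:NewSize=... to resize the young generation.
  public class AllocationChurn {
      public static void main(String[] args) {
          Object keep = null;
          for (int i = 0; i < 1000000; i++) {
              // Most of these arrays die immediately and should be
              // reclaimed by cheap minor (copying) collections...
              byte[] garbage = new byte[256];
              // ...but keeping a reference to every 10000th one forces
              // some objects to be promoted to the old generation.
              if (i % 10000 == 0) {
                  keep = garbage;
              }
          }
          System.out.println(keep != null);
      }
  }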

GC can be faster, posted 11 Feb 2003 at 11:03 UTC by robilad » (Master)

Let's say you have a lot of memory, and a large heavily interconnected data structure. As you do your processing on the data structure, you remove many nodes, leaving you with a really sparse data structure. It remains heavily interconnected.

Now you have to free the data structure. Remember, it is now a sparse data structure, with, let's say, only very small elements left.

In the best case for a copying garbage collector, your GC has already been shrinking the working set of your application as you thinned out the data structure, so you're left with a few pages of very small objects nicely lined up. Freeing the data structure is just a matter of subtracting the size of the data structure from the live object pool size.

In the worst case for manual memory management, you're left with small bits of data spread out over all pages of memory (and some on the swap, for added speed penalty). You have to go through every page of memory in the system to free the data structure.

Regarding GC, posted 11 Feb 2003 at 12:06 UTC by tk » (Observer)

Hans Boehm did a presentation on memory allocation myths.

Ho hum, posted 11 Feb 2003 at 15:27 UTC by aeden » (Journeyer)

How many times will I have to listen to people tell me how bad Java is? As a long-time Java developer I am constantly bombarded by people telling me that Java sucks because it is interpreted, because its garbage collection is slow, because it uses too much memory, because it has a bad name, because Scott McNealy smells...all the while I have been writing server- and client-side Java applications right next to my Perl, Python and PHP scripts, shell scripts, and whatever else turns out to be the right tool for the job.

Enough already! I use Java. It works. I have built many applications using Java and they are used every day in day-to-day business applications. Sure, Java is not perfect, but it is quite adept at handling most of my programming needs. If I ever need C or C++ then I will use them.

Please, lay off the rhetoric because I am tired of hearing it.

parrot fever - see psittacosis., posted 11 Feb 2003 at 17:32 UTC by sye » (Journeyer)

"Take these issues seriously" posted on Feb 11, 2002 @11:30 by Matthew Pierret.

I am a big supporter of Java and have used it exclusively for six years. I don't want to go back to any other language. But the cavalier attitude toward backward compatibility, gratuitous API changes and the introduction of bugs in new releases is unacceptable in a production environment. Putting the release cycle under greater discipline (including comprehensive backward compatibility regression testing) with an emphasis on maintaining backward compatibility would be a significant improvement.

While some may use the memo to try to undermine support for Java, dismissing this memo does not serve the interests of the Java community. The issues are real and make it that much harder to counter the voices calling for a homogeneous Microsoft world.
from Developer.sys-con.com

Need To See The Code, posted 11 Feb 2003 at 21:24 UTC by nymia » (Master)

Maybe looking at the code would get some things sorted out. But I doubt that will ever happen, since Sun owns Java and plans to keep the source locked.

Also, I've noticed Java slows my machine down to a 386-class CPU whenever I load Forte; it is just soooooo slow. Slower than Visual .NET's load time.

Regarding GC issues, maybe Sun management needs to listen to their engineers and let them pick the best GC; maybe adopt Ruby's GC, for example.

Why I like C++, posted 12 Feb 2003 at 00:39 UTC by MichaelCrawford » (Master)

I've done a fair amount of Java programming, enough to learn to do some fancy things with it, but I must say I prefer C++.

The general reason I prefer C++ to Java (or most other languages) is that I feel it enables me to be the best developer I can be.

I'm sure that many excellent programmers program in Java, but my feeling is that the very best C++ programmers can produce better results than the very best Java programmers.

However, I think Java is popular with management because it is easier for an average programmer to get something working sort of OK with Java than it would be for the average programmer to do it in C++.

The sad fact is that many programmers just aren't that skilled. Some folks aren't that smart, some folks aren't very experienced, and some haven't had adequate education.

It took me a long time to become a good C++ programmer. I'm not trying in any way to claim that C++ is easier than Java. It is a powerful and dangerous tool that yields great rewards to those who master it.

Milling machines are powerful and dangerous tools for metalworking, and you certainly shouldn't try to use a milling machine without learning how, but if you do know how you can do things like make car engines out of hunks of metal.

A common misconception about garbage collection (and by extension, Java programs) is that garbage-collected software does not suffer from memory leaks. Nothing could be further from the truth. My experience is that memory leaks abound in Java applications.

How can this be? you protest. Well, I'll tell you. You just have objects that contain references to memory you don't need anymore. The memory these objects reference won't get garbage collected.
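
A hedged sketch of the pattern described here (class and method names invented for illustration): a long-lived collection quietly keeps references to objects the rest of the program is done with, so the collector can never reclaim them.

  import java.util.ArrayList;
  import java.util.List;

  public class LeakyService {
      // Lives as long as the class is loaded, i.e. effectively forever.
      private static final List HANDLED_REQUESTS = new ArrayList();

      public static void handle(Object request) {
          // ... do the real work with the request ...

          // Remembering every request "for later" means none of them can
          // ever be garbage collected, even though the rest of the program
          // has long since finished with them. The heap grows without bound.
          HANDLED_REQUESTS.add(request);
      }
  }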

However, a leaky Java program will probably still mostly work. Solutions I have seen used in web server production environments include such things as killing and restarting the VM every day or so.

When I was a Smalltalk programmer, we used to save memory dumps into executable image files. That way we could restart our development environment right where it was when we shut down the PC the previous evening. It was nice in theory, except for the memory leaks. After about a week of dumping and restarting a given image, performance would become so slow as to be unusable.

I resolved this by keeping two images: one that I would keep in a virgin state except for weekly updates from the source code repository, and another that I would copy from the virgin one each Monday and use for only that week.

Yes, it is harder to manage memory in C++, but I feel it is the proper job of a developer to take personal responsibility for his memory consumption.

It really is a better way to live. Efficiently managing your own memory will make you a better person.

I used to have a lot of trouble with C++ memory leaks until I started using the Spotlight memory debugger from Onyx Technologies. When I saw how leaky my programs were I started learning more about how to manage memory. After a while it became such second nature that I write code that is leakproof without having to pay much conscious attention to it.

I wrote an article about a refactoring process as I learned to manage memory better.

You might protest that only incompetent programmers create memory leaks in Java code but I have heard that Java libraries distributed by Sun for production use contain leaks - even Java meant for embedded use, where memory is precious.

I'm sure that skilled Java programmers can write leakproof applications, but I think the problem is that most don't feel the need to learn the discipline because they think the language will take care of them.

You should understand that I advise neophyte programmers to learn Java as a first language. It is easier for the inexperienced to get something working in Java. But I also advise them to learn better languages once they become good enough that they want to write programs for real end-users to use.

Flame on,

Mike

Re: Freenet, posted 12 Feb 2003 at 06:28 UTC by obi » (Journeyer)

I completely agree with the assertion that Java is the reason for Freenet's current lack of success.

Okay, Freenet is trying to solve a hard problem. But my perception is that its reliance on Java really doesn't improve the development speed or the performance of the network.

Every now and again, I try to install the damn thing on my servers. I'm already annoyed that it pulls in all kinds of dependencies (like X libraries), but when I see the CPU overhead, I cringe even more. I'm longing for the day when it works with gcj, hoping that will solve some of its problems.

It's probably partly perception, but I'd much, much rather run some Python-based server code, or even a Quake 3 server, on my machines. I just feel really uncomfortable with Java code on servers.

I still remember when Java was this great promise, and reading in my Java books that there was a reasonable tradeoff between performance and ease of use. I still think the language is rather nice (easy threads, interfaces instead of multiple inheritance, etc.), but the VM is slow, and the runtime/classes are worse than bloated (a lot of people dislike Swing).

Someone mentioned applications, like LimeWire (which I knew of) and Columba (which was a rather positive surprise). But even with these apps the startup times are horrendous. And yes, startup times are important, and IMHO are becoming even more important.

Even though you mention these nice examples, the fact is that these apps, as nice as they may be, are more the exception than the rule. Considering how many people are programming in Java these days (it was taught in universities throughout the late nineties, and it's a very marketable skill), and considering how easy it's supposed to be to make large apps with it, it's remarkable how few actually usable programs have come out of it.

Sun is wrong - Java doesn't suck, posted 12 Feb 2003 at 06:49 UTC by jpick » (Master)

Perhaps you should try Kaffe. If you've got a problem with the implementation, it's open source, and you can fix it. There are probably half a dozen other free implementations out there that are quite nice as well.

Kaffe has stayed relatively small, and a lot of people are using it for embedded targets.

Kaffe may be trailing a bit in getting the newer APIs implemented, and its GC and JIT subsystems are simpler than the state of the art, but it's fun to work with.

The nice thing about playing around with a VM implementation's source directly and trying to understand it is that you start to learn why things are the way they are in Java, and where the "traps" are.

There is a lot of really good Java code out there. But you really have to keep a close eye on things when you have tight memory requirements. A lot of Java library code was not written for environments where memory consumption was a big concern - so a random chunk of code may make certain size/speed tradeoffs that might not fit your needs.

If you write your own code, you can do almost as well as native code on applications over 1MB in size. Frankly, most native code (e.g. C and C++) does a piss-poor job of memory management; most GC implementations will do a better job, and automatically. Of course, a clueless programmer is going to have trouble in either environment.

For embedded programming, I've found Java to be really nice. Say goodbye to segfaults. And if you are sloppy, things degrade gracefully: they just get slower as GC kicks in more and more often (assuming you fix the heap size). Plus the commercial debugging and profiling tools available are simply incredible; you can literally see everything.

There is a lot of cutting edge work being done on advanced multi-heap, thread-local heap, fully concurrent GC implementations. Projects such as KaffeOS, JanosVM, and RTSJ have developed ways to gain even more control over the GC process - so even realtime processing with Java is within grasp.

Sun does deserve some criticism for adding a lot of bloat to their VM libraries. The core VM doesn't seem too bad, but Swing apps in particular hurt. IMHO, they never really got their UI house in order; they just added a lot of bloat. Fortunately, free software types don't have to stick with the traditional solutions: over the years, literally dozens of windowing toolkits have been stuck on top of Kaffe...

And Java can be compiled either AOT (ahead-of-time), e.g. with GCJ, with a JIT, or even interpreted. That's a lot of flexibility. And with advanced compilation techniques, the quality of the generated machine code can be really high. There's a lot of research going into optimizing compilers for Java, as it's a big market, and the core problem space is small enough that it's a great space to experiment in if you're a compiler freak. At the end of the day, I truly believe that dynamic compilation techniques coupled with advanced control systems theory will eventually prove to generate faster, more optimal systems than anything static compilation techniques will be able to achieve. Static compilation has to live with all the invariant assumptions it has to make ahead of time, with no feedback loop.

The future of application programming lies in separating the model from the UI. If you get into serious modelling work, it's hard to beat the flexibility and the tools that are available to today's Java developers. The language is a natural for modelling and refactoring, and IDEs such as Eclipse and NetBeans are incredible (and free). There are literally dozens of application server frameworks aggressively fighting for mindshare on what is the best way to develop Web applications -- see Apache's Jakarta project. Sun's influence has been positive in many ways, as even though there are zillions of solutions, there are standard APIs for things like XML, SQL, and servlets that everybody can agree on. It makes it really easy to mix-and-match and play to come up with the best way to do things.

I've seen some pretty incredible UI front-ends on Java. I think the future for UI design lies with XUL, XForms, and other XML-based UIs, such as Glade, SVG, SMIL, etc. If you want to play with some free software, take a look at Luxor and X-Smiles. UIs of the future will be standards-based, themeable, and very cool. Java will be there, since it's really good at XML and dynamism.

Cheers,

  - Jim

Sun /didn't say/ that Java sucks, posted 12 Feb 2003 at 14:59 UTC by dan » (Master)

Read the memo. Sun said that one particular implementation - the Solaris JRE - sucks, and for reasons mostly to do with the development, release process and upgrade paths. That was not a criticism of the Java language.

(That's quite apart from my own personal opinion, of course. If java "is a natural for modelling and refactoring", I really don't want to know what contortions you're willing to put up with before you consider something "unnatural" :-)

Re: Java is fine for the majority of its uses, posted 12 Feb 2003 at 23:27 UTC by brane » (Journeyer)

It's important to look at how Java is really being used. Mostly, it's for web applications, which consist of a fairly thin layer of glue between the web server and some data stores.

*sigh*

I wish it were so simple. Back at work, we're writing a Web service (in Java, of course) that has to digitally sign XML messages. Signing a few hundred bytes of XML with a 2kbit key is not just slow -- it takes more than a second! Even replacing the existing crypto provider with a native one that uses OpenSSL doesn't help, because JNI sucks rocks performance-wise.
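
For comparison, here is a hedged sketch of just the raw RSA signing step using the standard java.security API (the class name and sizes are illustrative; XML canonicalization, key management and the rest of XML-DSIG, where much of the real cost may live, are not shown):

  import java.security.KeyPair;
  import java.security.KeyPairGenerator;
  import java.security.Signature;

  public class SignTiming {
      public static void main(String[] args) throws Exception {
          // Generate a throwaway RSA key pair for the measurement.
          KeyPairGenerator gen = KeyPairGenerator.getInstance("RSA");
          gen.initialize(2048);
          KeyPair pair = gen.generateKeyPair();

          byte[] message = "<doc>a few hundred bytes of XML</doc>".getBytes();

          // Time only the signature itself.
          Signature sig = Signature.getInstance("SHA1withRSA");
          long start = System.currentTimeMillis();
          sig.initSign(pair.getPrivate());
          sig.update(message);
          byte[] signature = sig.sign();
          long elapsed = System.currentTimeMillis() - start;

          System.out.println(signature.length + " byte signature in "
                  + elapsed + " ms");
      }
  }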

So no, Java's performance is not acceptable for anything but the most brain-dead Web apps, and that's not taking into consideration the memory footprint on the server...

Re: Freenet, posted 13 Feb 2003 at 00:07 UTC by Bram » (Master)

My impression is that Freenet got out of control because it gave everyone and their brother commit access and the codebase quickly became bloated and unmaintainable. It had almost nothing to do with the choice of language.

Sun of course doesn't say "Java sucks", posted 13 Feb 2003 at 04:30 UTC by exa » (Master)

How can they? :) Actually I said that.

And looking at it language-wise, it doesn't really suck. However, the VMs I've tried, such as the IBM JDK 1.3 and Blackdown (I also used some stuff on Windows in the old days), didn't seem to be high performance at all. Also, I have a gut feeling that the VMs don't perform badly just because of deficient implementations, but because of the Java language and VM specification. Even Java chips can't rescue Java. (Have you ever heard of an OCaml chip or a C++ chip?)

Most notable, I think, is Forte. It's like a joke. I never got it to run properly, and it consumes ridiculous amounts of resources. It really turns your machine into one from the previous generation.

I'm not a great fan of C++ either, but at least it is a general purpose language. That's to be commended.

When Java first came out, it was: Java is the programming language of the WWW. Then: Java will run everything from desktops to embedded devices. Then they said it's good for web servers, and now I think they're going handheld or God knows what (maybe it's really good for handhelds; I don't have any experience with the one in the Zaurus or anything like that). That shift of focus is not very promising.

On the other hand, maybe we should give more time to Java. Look how many years it took for the first real C++ compilers to come out. Maybe it's just a matter of time.

Still, I can't imagine using Java for anything that requires an algorithm more complex than quicksort. I can picture GUIs, network agents and that sort of thing, but I can't see Java accomplishing all sorts of stuff.

The more annoying thing is Sun people coming to universities, convincing/bribing/whatever people, and then changing the curriculum to Java. So universities drop an ISO standard and adopt Sun's standard. Then Microsoft comes along and tries to change the curriculum to C#...

That's crazy. It's not even in the best interest of Sun or Microsoft. Maybe it's better for them than for others, but then again all the CS101/CS102 assistants complain that students don't understand any basic idea about programming, because the students think it consists of Java's oddities.

Ah, and that's another thing they said: "Java is good for education". Really? I don't think any C variant is good for teaching the principles, because, let's face it, C is *not* a principled language. It's the ultimate product of the hacker mindset, maybe, but it isn't a good programming language design. It lacks even very basic concepts such as nested scopes (a function in a function), and all derivative languages share some of its limitations. The ones that tried to extend it sometimes made it too complex and ugly (like C++), and sometimes failed on the performance and system integration part (Java, maybe C# -- they say it's better). I would definitely choose Pascal or LISP (not very intuitive) to teach programming. And maybe let students work with some toy languages before that. (Our instructor did that.)

von Neumann languages suck ..., posted 13 Feb 2003 at 08:27 UTC by mslicker » (Journeyer)

Paraphrasing John Backus's 1978 Turing Award lecture, Can Programming Be Liberated from the von Neumann Style?: C++, Java, etc., etc., etc. ... are all von Neumann languages according to his classification.

I've convinced myself these languages are relics of the past. When you have used languages as elegant as Forth, Lisp, or ML, it is hard to go back to what seems a more primitive, tedious form of programming.

This discussion is going beyond Java..., posted 13 Feb 2003 at 13:59 UTC by tk » (Observer)

Erm... the "basic" ideas of programming? There are at least two views of programming: one is that a program specifies what to compute, another is that a program specifies how to compute something. Both views are equally valid, so I'm not sure if there'll ever be an agreement on what constitutes the "basic" ideas of programming.

And isn't Forth a von Neumann language too?

Forth chip + Forth language, posted 13 Feb 2003 at 15:04 UTC by sye » (Journeyer)

What's so great about the Forth language is that its inventor, Chuck Moore, split his time between designing his own chips and writing his own Forth. The reason von Neumann became the von Neumann we know in the pioneering world of computing is that Uncle Sam happened to lend von Neumann his big pocket.

Language flame-fest, posted 13 Feb 2003 at 18:40 UTC by raph » (Master)

I'm afraid that the real, and interesting, issues in the memo are being obscured by run-of-the-mill language flaming.

First, let's separate Java-the-language from Java-the-class-library. Generally, I like the language, but from what I've seen of the class library, it's fairly bloated, contains numerous infelicities, and isn't getting that much better. The root of the problem, of course, is that it's controlled by Sun.

Sun's implementation has performance problems, for sure. The way to fix that is to have healthy alternative implementations. Unfortunately, these have had a hard time gaining traction, largely because of the need to reimplement the huge class library in addition to the language proper.

Any time you adopt technology controlled by a single corporation, you run the risk that its defects won't be easily fixed, and (more importantly) that its evolution won't go in the direction you like.

So there are at least two separate issues here. Even if one accepts that Java is beautiful and elegant, the politics and processes around it are not. People who are making a decision to adopt Java should know what they are getting into, and shouldn't restrict themselves to only looking at the technical issues.

exa: I don't share your "gut feeling" that the performance problems are caused by the language and VM specification.

I do think that Java is a reasonable language for teaching programming, because the language itself is pleasingly simple, especially compared with a monster like C++. And, in this domain, the performance issues are much less important. I personally like Python best as a first language to learn -- it's what I'll be teaching my kids.

von Neumann continued ..., posted 13 Feb 2003 at 19:25 UTC by mslicker » (Journeyer)

People like to classify languages as either declarative or imperative. From my experience, it is hard to find a truly declarative language. Even in a language like Prolog, where people claim the specification is the program, you must know the operational behavior: how the proof search is done. Termination becomes an issue, as does efficiency. An operational cut feature is used to trim the proof tree.

Backus, I think, is providing us with a more useful classification scheme. In the first six pages of his paper, he gives a concise description of what he terms the "von Neumann style"; I won't attempt to repeat it. Characteristic of von Neumann languages, however, are sequences of statements which act as transitions on a complex state. He describes this as word-at-a-time programming, the assignment statement being its most prominent feature. Interestingly, he calls these languages "fat and flabby", with quite a large set of framework features to compensate for the essentially word-at-a-time style. This seems to be still quite true of the languages being created to this day.

Where does Forth fit in? Forth, particularly older Forths, certainly contains remnants of these von Neumann languages, such as control structures. However, the basic model is function composition. Words are functions from stack to stack, and new words are defined as the composition of zero or more words. Forth has store and fetch, which Backus terms historically sensitive operations. Of course Forth can be extended for other types of programming, so it doesn't really stay in any one category, although it is essentially non-von Neumann. It should be noted that Lisp and ML dialects also have von Neumann features.

I don't want to be misinterpreted: I'm not trying to contribute to a language flame fest, as raph seems to suggest. Hopefully people can use this information to look at programming languages in a new light. I think people generally exchange one von Neumann language for another without really testing the waters elsewhere. I realise, though, that the language is often mandated, and the choice is not the programmer's to begin with.

Java and teaching, posted 13 Feb 2003 at 19:50 UTC by nether » (Journeyer)

"Reasonable for teaching programming" forsooth. Here's the simplest possible Java program:

  class Hello {
      public static void main(String[] args) {
          System.out.println("Hello, world!");
      }
  }

Let's see how many concepts we encounter: class definitions, method definitions, access modifiers, the static modifier, return type declarations, the void type, parameter declarations, the String type, array types, command line arguments, the System class, static fields, the PrintStream class, the println method and string literals.

I don't think any teacher is going to go through all these concepts before letting the students read and write their first programs. So the only option is to say "ignore all this extra stuff for now, we will deal with it later". But in effect this means "you aren't supposed to understand this program, don't even bother". And this is extremely demotivating for a beginning programmer.

Java just has too much crud obscuring the real essence of programming from students.

Too much crud?, posted 13 Feb 2003 at 20:32 UTC by Nelson » (Journeyer)

Java just has too much crud obscuring the real essence of programming from students.

I don't remember hitting post., posted 13 Feb 2003 at 20:37 UTC by Nelson » (Journeyer)

Sorry about that.

Too much crud compared to what? C? Hello world in C has all of that same stuff except the class, but with include headers instead. C++? The same, plus a namespace. Pascal? I'd say it's about the same.

The one thing I liked about Java was that it's so lightweight (typing-wise) to create new classes. It has always felt to me like doing it in C++ was much more involved. You create a header, you define your members, you create an implementation file, you implement your members... It just seems like a ton of work when you're in the thick of it all.

On Backus's classification of programming languages, posted 13 Feb 2003 at 21:06 UTC by exa » (Master)

I'm glad that my flame-ish comments generated enough interest for a useful discussion :)

I didn't re-read the lecture but of course I remember it as it has a special place in the history of programming language religion wars :) [Hmm, I should have a look at it now]

What he said was that writing in an imperative language basically consists of sequencing the ins and outs through a pipe, namely the bus between processor and memory. You're stuck there at the von Neumann bottleneck.

The unavoidable conclusion is that imperative languages strongly favor a particular computer architecture, since the most important thing in the language, the assignment operator, models data flow over the bus.

On the other hand, it is possible to write a functional specification and let a translator decide how to compute it on a given architecture. That is a more high-level approach, and it has the advantage of being architecture-independent. Ideally, functional languages would free programming from the low-level approach of imperative languages and allow us to write code that performs well on various architectures.

However, it should be noted that today: 1) functional languages aren't necessarily easy to learn or use, and 2) they are not yet truly tested on parallel computers (although there are things like GHC for PVM).

On the other hand, functional languages are equipped with a lot of other great features, such as advanced type systems and elegant semantics, that achieve a much higher level of abstraction than any imperative language, so much of what Backus said is very relevant.

Still, it remains to be seen whether there are even better ways of programming (remember programming by example page at MIT?)

Less crud, posted 13 Feb 2003 at 21:34 UTC by nether » (Journeyer)

I'm not saying anything new here, but still...

Less crud in hello world. These are complete legal programs:

Scheme
(display "Hello, world!\n")
Python
print "Hello, world!"
Perl
print "Hello, world!\n"
O'Caml
print_endline "Hello, world!"
Haskell
main = putStrLn "Hello, world"

Beginning to see a pattern? Next, class verbosity:

O'Caml
class hello = object
  method hello = print_string "Hello, world!\n"
end
Python
class Hello:
  def hello(self): print "Hello, world!"

...etc. If you try to argue in favor of Java by comparing it to C++ then, well... "There are more things in heaven and earth, Horatio, Than are dreamt of in your philosophy."

Which language for Computer Science education?, posted 13 Feb 2003 at 21:45 UTC by exa » (Master)

Maybe this is the subject for another article!

I agree with nether about Java distracting students from what is really important in a programming language. There are even worse ways students get confused about programming languages. I think most students believe programming is "using that IDE", or the other particulars of the Java environment they are given... while all they should think about is the semantics of a written program text. ASCII text, that simple. And an arrow that goes from that string to a meaning in their minds. Very important for Computer Science students.

My basic feeling, 7 years into OO languages, is that OO is not a very important development in programming language semantics, and it is definitely not the core of programming either. The really interesting thing is that most languages, like Java or C++, have deficient object systems coupled with weak imperative languages, so I think they are doubly borked.

Now, let's think about what object systems are good for. They are good more for a component/architecture mindset, for developing large programs, re-usable code, better abstraction, etc. And therefore they are good for doing system-related stuff like GUIs, XML, and what not.

Some parts of computer science are really interested in that kind of stuff: there are the software engineering people, and then there are the UI people, etc. They would be interested in getting that kind of program right, so OO has a significance in the practical world, where most of the existing code will be C++/Java...

Nevertheless, real computer science is less concerned with system stuff. From the theoretical point of view, a large percentage of all code written is trivial: just moving chunks of data from one place to another. For something to be of interest to a computer scientist, there must be an algorithm.

And there I think the requirements get a little complicated. I do think that students must be taught one good procedural language in which they can program algorithms. On the other hand, I think they should be taught a functional language like LISP or ML, so that they know what's out there in the world.

I don't think C, C++ or Java are good procedural languages. They have a horrible design, and I think a programmer is good only when he recognizes how ugly and disturbing C derivatives are!! (So I do think that I'm a good programmer, hmmm.) In fact, C sucks in almost every major aspect of design, such as orthogonality, type system, modularity, etc.; maybe it does a little good on readability/writability when used right.[*] The ideal procedural language would be ALGOL 68 for me, but I can't think of a widespread compiler for that right now, so it seems that Pascal or Wirth's later languages would be a better choice. The difference between Wirth and Stroustrup? The former is a language designer. (It's said that the latter is a mathematician, but I don't know if he contributed anything useful to mathematics. Though he has written a lot of industrial code.)

Anyway, for the functional language, COMMON LISP might actually be a little confusing, so Scheme might be preferable within the LISP family. I definitely wouldn't have any freshman use Haskell before it becomes mature enough (but for a third-year course it would be fantastic). In the ML family, Caml Light would be a good starting point, and with a great language processor things can only get that good for a new student. I think that's the language that will have him feel a little magic beneath his fingertips.

So, what does a freshman need to learn first? Thinking about it: 1) the type system, 2) literals, 3) variables, binding/scoping, 4) functions, 5) compositional semantics, 6) how to implement an algorithm, 7) I/O, etc.

Thinking about it, either they should be taught one procedural and one functional language, or a language that supports both paradigms (that is NOT C++), like Caml. The ML family being quite orthogonal, it's also possible to take a subset of Caml and use that easily.

Even before that, something like LOGO would be useful and fun. I think the ability to do some graphics does a good job at drawing interest to the subject matter.

Enough for now ;)

Comments welcome,

[*] I think Java fixed modularity, C++ couldn't.

Java, the Chicken of Tomorrow, posted 13 Feb 2003 at 23:06 UTC by crackmonkey » (Master)

I recommend all Javur bashers read The Language of Tomorrow by Miles Nordin. It's an amazing piece of writing, and was featured rather heavily on the crackmonkey list a year or so ago. I love these tidbits especially:

If Freenet were a C program, it would have been picked up by all the Unix package collections by now, and would be just as easy to install as lynx or mutt. Since it's written in Java, it's a portability nightmare, and only a small inner circle has gotten it almost-working. Java's decoy claims of portability have in effect killed the Freenet, and dragged the Freenet architecture down to the same level of broken fantastic promises that Java makes. ``The mythical Freenet about which we have heard so much.''

If Java itself is portable, then why isn't there a portable way to install and run a Java program without dealing with spaghetti .class-files, setting CLASSPATH, and referring to arcane modules contained within .jar files? Why do we have to use a Unix shell script to start a supposedly-portable Java program?

C bashing, now?, posted 14 Feb 2003 at 00:04 UTC by ncm » (Master)

It's always been very fashionable to heap criticism on C, and (when the time came) on C++. For some reason, though, they keep seeming to be the most practical tools to use for the hardest jobs. You can't, in good conscience, chalk that up to historical accident (as you could for, e.g., MSWindows) -- their adoption has always been driven by technical reasoning, and usually in direct contravention to management, and even federal, decree. (Does anybody remember Ada?) C won over Pascal and its derivatives, and C++ won over the other languages that came out around the same time (e.g. Ada, Eiffel and ObjC) fair and square, even starting at a disadvantage.

There must be something fundamentally right about C, and C++, at least for programming von Neumann machines, that any language you hope to replace it with had better share. Whatever it is that they got right, it must be *really* right to overcome all the deficits that are so fashionable to keep pointing at. If your language doesn't have that thing, then no matter how elegant you think it is, or even how elegant it really is, too bad.

When Alex Stepanov designed the STL, a subset of which was adopted into the Standard C++ library, he tried implementing it first in many languages, including Lisp, ML, and Ada. ML and Lisp were capable, but could not be made to run at more than a small fraction of the speed possible in C++. (See.) He took that to indicate a fundamental kind of power not only in the C++ template system, but in the abstraction represented by C pointers, which he was able to usefully extend to all other sequences. (Pointer analogs, called "iterators", fill a role as pervasive in the STL as lists do in Lisp.)

I don't know if C pointer semantics are the magic juju, but I do know that dwelling on the faults C and C++ doesn't teach you much about why they have been so successful anyhow.

(Getting back on topic for a moment, what Stepanov says about Java in the article is also interesting.)

Maybe the problem is that nowadays people only build machines to run C code fast, and that habit handicaps other languages. If only some other language were King, machines would be different and C would be at a disadvantage on them. Curiously, on the Pentium 4 the canonical C statement "while(*p++=*q++);", compiled naïvely, is said to be very slow. In fact, most machines have had some variation on this problem -- an unfortunate stack frame layout, awkward shift or integer-overflow semantics -- and compilers have just had to work around them. CPU architects have been as subject to C-bashing rhetoric as the rest of us.

So, getting back on topic again... whenever problems with JVMs, or the Java libraries, come up, people are always quick to say those failures are incidental, and don't indicate fundamental problems. After all this time, if those incidental problems still haven't been resolved, maybe it's time to consider whether there's something fundamentally wrong that (as with the Sirius Cybernetics Corp's artificial people) is merely masked by the incidental nuisances.

I think maybe that's what Exa meant to suggest.

Java's importance..., posted 14 Feb 2003 at 00:51 UTC by chalst » (Master)

...IMO is that it convinced the world of the importance of VMs. There are problems with the JVM, but it is indubitably both intellectually one of the seminal virtual machines, and practically possessed of many great strengths. I don't think so much of Java the programming language, but the JVM Python interpreter is very mature. There is even a good scheme implementation. All we need is a good Common LISP implementation...

...continued..., posted 14 Feb 2003 at 00:54 UTC by chalst » (Master)

Also it convinced quite a few people that a practical programming language could and should have GC.

On C, C++, Java and popularity!, posted 14 Feb 2003 at 03:11 UTC by exa » (Master)

While I did mean that there might be fundamental problems with the Java language and the JVM specification, I don't think you can explain C/C++'s widespread use by successful language design.

Popularity does not necessarily imply excellence. It does imply adequacy, though, and that's why I think there aren't many important Java apps around: it's simply not good enough. Maybe it will be in the future, maybe not, but for many years it has not been, and that isn't very promising.

The success of C and C++ is adequacy. C was simple enough and it worked well. C++ brought some advanced concepts down to the mainstream from the research labs, and that is applauded more than necessary, I think. As a result, people could get their hands on this stuff and used it to write larger apps. And when a language dominates, people don't look for others too actively. Maybe most people want to learn only one or two languages. Look how most programmers don't even understand all the important features of C++, or write terribly bad things with it (ever used MFC?).

C is basically a half-cooked procedural language that was designed to be a sort of portable assembly. That's all it is. Look how laden with preprocessor macros it is. It's just assembly. It's that crude.

C++ generics are a joke. Just because Alex couldn't get it right doesn't mean C++ is the perfect combination of efficiency and genericity. C++'s template specification and the terrible implementations do not add up to genericity of any sort I would expect from a modern language. And what use is genericity in a language without proper modularity? It's really pathetic.

Besides, I'd like to remind ncm that pointers are not unique to C or its descendants. They are not a big thing either; they're just an abstraction of address registers and of storing addresses in assembly.

Going back to STL iterators: that must be the worst software abstraction in the entire world. Could it be done better in C++? Maybe. Would it be worthwhile? I don't think so. So STL still might be a good thing for C++, but it is not a crowning achievement in writing a package of data structures together with some algorithms.

Especially, the so-called functional aspect of STL is not very usable in practice. It is useful as an imperative tool, though.

I myself had a chance to implement the infinitely stupid idea of expression templates for a parallel array library, and found myself wrestling with a mountain of unmaintainable code.

There is something else I'm trying to say, I think.

Wake up! It's 21st century! C++ is dead, long live ocaml!

Thanks for listening!!

ncm: BTW, Alex's work is old, posted 14 Feb 2003 at 03:29 UTC by exa » (Master)

I'd read the HP paper. I had found it interesting 5-6 years ago (ah, so that's where STL came from!). I was a great fan of the allocator stuff and the zero-overhead principle, but it lost its appeal for me some 2-3 years ago, when I began to see the whole STL thing as another template hack. It is of course a useful thing, and I use it all the time when I'm writing C++ apps, possibly putting it to better use than most C++ programmers, but I don't think it's the ultimate programming approach.

I definitely disagree with Alex that "iterator" is an important concept in Computer Science. No! I don't see why you should even be bothered with iterators when you have higher-order functions; it makes zero sense (to anybody who has some experience with the ML family). Iterator is an important concept in the STL, of course, because they wrote it that way.

Iterators: Signs of Weakness in Object-Oriented Languages, posted 14 Feb 2003 at 14:48 UTC by dan » (Master)

Henry Baker, in "Iterators: Signs of Weakness in Object-Oriented Languages" (1993), makes the point that iterators depend on internal state and side effects. This makes it very difficult to prove anything about them, and also creates problems when using them in multi-threaded environments, unless the compiler can prove there is no sharing involved.

http://citeseer.nj.nec.com/55669.html. Source links appear to be broken, but citeseer's cached versions are still available.
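
To put Baker's point in Java terms, here is a hedged sketch (the visitor interface and class names are invented, not from the paper): an external java.util.Iterator carries mutable position state between calls, whereas an internal, callback-style traversal keeps that state hidden inside the collection code.

  import java.util.ArrayList;
  import java.util.Iterator;
  import java.util.List;

  public class TraversalStyles {
      // Internal iteration: the collection drives the loop and the cursor
      // state never escapes, which is easier to reason about.
      interface ElementVisitor {
          void visit(Object element);
      }

      static void forEach(List list, ElementVisitor visitor) {
          for (int i = 0; i < list.size(); i++) {
              visitor.visit(list.get(i));
          }
      }

      public static void main(String[] args) {
          List words = new ArrayList();
          words.add("hello");
          words.add("world");

          // External iteration: the Iterator object holds hidden, mutable
          // position state between calls to next().
          for (Iterator it = words.iterator(); it.hasNext();) {
              System.out.println(it.next());
          }

          // Internal iteration via a callback.
          forEach(words, new ElementVisitor() {
              public void visit(Object element) {
                  System.out.println(element);
              }
          });
      }
  }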

JVM, C++, CL, GCs and Cache, posted 15 Feb 2003 at 06:38 UTC by mdanish » (Journeyer)

chalst: Mapping Common Lisp semantics onto the JVM is an extremely difficult problem, since the JVM was designed to cope with relatively powerless languages such as Java. Scheme has its share of problems too; I don't think there exists a Scheme-to-JVM compiler that completely implements call/cc, for example.

ncm: You can't even begin to compare the C++ STL to Common Lisp. Why would you even implement such a thing in Lisp, when Lisp already has much better facilities, and lets you take advantage of features that C++ programmers can't even dream of?

As for GCs and cache; I have two words: `fragmentation', and `copying'. The former with regards to manual memory management, and the latter with regards to GC.

Can we move this to comp.lang.java.advocacy ..., posted 16 Feb 2003 at 23:13 UTC by robilad » (Master)

... where it belongs, please ? ;) It's already degraded into 'my favorite language is better than yours', and I'm only waiting for someone to claim that Prolog is superior to PL/I ...

"Why don't C++ and free software mix?", posted 19 Feb 2003 at 03:03 UTC by sye » (Journeyer)

From Guillaume's link, I dug out this old article #207 on C++, published Dec 2000 by egnor. Guillaume says "I think the reasons why C++ and free software don't mix too much are mostly cultural". I think Java is popular mainly because people who majored in English or studied English as a second language need to write something for a living.

Huh?, posted 19 Feb 2003 at 03:30 UTC by atai » (Journeyer)

sye, can you explain in more depth what you meant?

CLisp, Scheme for JVM is a solved problem, posted 19 Feb 2003 at 09:34 UTC by chalst » (Master)

mdanish: There are R5RS Scheme implementations in JVM bytecode out there (SISC seems to be best-of-breed), so what you say is false. I can't think of any reason why a CLtL2-compliant implementation would present any real difficulties, but I have not yet heard of such an implementation.

Re: Java, The Chicken of Tomorrow, posted 23 Feb 2003 at 14:38 UTC by aeden » (Journeyer)

crackmonkey: It's called an executable JAR, and it makes executing a Java application as simple as either double-clicking the JAR or executing the following command line:

java -jar MyJar.jar

Everything you need for writing good applications is available in the Java toolkit. More often than not it is the programmer's fault when any code works poorly.

Teaching, posted 23 Feb 2003 at 14:43 UTC by aeden » (Journeyer)

nether: I agree that Python is really good for introducing programming to students - I have used it in such a role and have found that the students appreciate its clean syntax. You can't beat

print "Hello, world!"

can you? However, I don't think that Python's OO support holds a candle to Java's. Java starts to shine when you are creating more complex applications which must be extensible yet restricted by a specific contract.

Iterators - an abstraction of lists, posted 25 Feb 2003 at 19:25 UTC by johnnyb » (Journeyer)

Iterators are very interesting because they are abstractions of lists. This allows what would normally be list operations to operate on more dynamic structures, such as input, the digits of pi, or the output of other functions. List operators usually have to rely on fixed-size lists. Lazy lists may be an exception, but I would have to look into it more (I have not done a lot of research into lazy languages).
