Name: wen lai
Member since: 2000-07-22 18:48:02
Last Login: N/A

No personal information is available.

Recent blog entries by wen

25 Jan 2001 (updated 25 Jan 2001 at 20:31 UTC) »

An idea to bring together all the different people and all the spoken/written languages in the world:

A universal dictionary that focuses on the meaning rather than the word. A dictionary that gives a universally unique ID to a very specific definition/concept that is normally represented by a word. Different words from different languages that match the meaning precisely are then referenced to it, with similar/related terms also referenced to it. In any case, the definition is explained in every language possible, but not every language needs to have a word that represents the definition. For example, the French word terroir, as I understand it, has no precise equivalent in English, and the Japanese word umami, for the taste imparted by MSG, has no equivalent in most other languages either. We can possibly explain such a concept, but some language somewhere in the world has a concise term for it.
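To make the idea concrete, here is a minimal sketch in Python of what such a concept-keyed dictionary might look like. The IDs, field names, and definitions here are all invented examples, not a real standard:

```python
# Sketch of a concept-ID dictionary: each entry keys a unique ID to
# per-language definitions, plus whatever words (if any) each language
# has for the concept. All IDs and entries are invented for illustration.
CONCEPTS = {
    "C-0001": {
        "definition": {
            "en": "the environmental character a region imparts to its produce",
            "fr": "l'ensemble des caractères qu'un lieu donne à ses produits",
        },
        "words": {"fr": ["terroir"]},  # English has no precise equivalent
    },
    "C-0002": {
        "definition": {
            "en": "the savory taste imparted by glutamates such as MSG",
        },
        "words": {"ja": ["umami"], "en": ["umami"]},  # loanword in English
    },
}

def word_or_definition(concept_id, lang):
    """Return the language's word for a concept, else fall back to a definition."""
    entry = CONCEPTS[concept_id]
    words = entry["words"].get(lang)
    if words:
        return words[0]
    # Not every language has a word; every entry is still explained.
    return entry["definition"].get(lang, entry["definition"]["en"])
```

So looking up C-0001 in French yields "terroir", while looking it up in English falls back to the explanation, which is exactly the asymmetry described above.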

Why should we try to do this? Well, my reasoning is that this dictionary could serve almost as a modern-day Rosetta Stone, letting us translate information and understand each other more easily. With a dictionary such as this, the definition, not the word, is the focus, and therefore it is less likely that things "get lost in the translation" or lose their significance in context over time. (Actually, I think I'm probably wrong on both of these.)

The dictionary will also be extremely self-dependent and recursive: the terms used in the explanations will themselves refer to other entries in the dictionary, so at some point you need a set of intrinsic words that are simple and atomic in concept.

Some of the problems would obviously be political. Others: how to differentiate subtle shades of meaning, and how to handle precise versus imprecise synonyms (words meaning exactly the same thing versus slightly different things).

This dictionary could be very useful because, unlike normal dictionaries that try to match words, it would match meanings, and would therefore be much more precise. It could be used for programming, for example: identifiers could be substituted into a different language by using that language's word for a given meaning, with the program only needing to specify which meaning to use in each place. There are a few caveats here too, of course... but for the most part, I think it could work pretty well.
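The programming use case could look something like this sketch, where source text carries concept IDs and a per-language lexicon renders them for the reader. The lexicon, IDs, and template syntax are all hypothetical:

```python
import re

# Hypothetical per-language lexicons keyed by concept ID.
LEXICON = {
    "en": {"C-0100": "total", "C-0101": "price"},
    "fr": {"C-0100": "total", "C-0101": "prix"},
    "de": {"C-0100": "Summe", "C-0101": "Preis"},
}

# Source stored with concept IDs instead of words.
TEMPLATE = "{C-0100} = sum({C-0101})"

def render(template, lang):
    """Substitute each {concept-id} with the chosen language's word."""
    return re.sub(r"\{(C-\d+)\}", lambda m: LEXICON[lang][m.group(1)], template)
```

Rendering the same template for "de" gives "Summe = sum(Preis)" while "fr" gives "total = sum(prix)" - the program itself only ever names meanings, never words.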

Oh, FYI, my idea was inspired by reading this article on

7 Dec 2000 (updated 7 Dec 2000 at 14:50 UTC) »

Since I can't post a reply to the software reliability article yet...

I think that people are comparing apples and oranges when they compare things like microwaves and VCRs with PCs. Those are very simple, single- or few-purpose devices with predetermined components that are likewise simple. The PC is not that.

The PC, to me, will always be a multipurpose, mass-distributed experimental device that happens to serve end-user purposes. This is why hackers love working with PCs - PCs can be easily programmed, reconfigured, and upgraded to do a myriad of things. The environment will remain dynamic, driven by upgrade envy, commercial marketing efforts, and (in small part) the constant, continuing effort to improve upon the past.

What I am saying about software reliability on the PC and similar multipurpose, multifunction platforms is that it's damn near impossible to get the kind of reliability that VCRs and appliances achieve, because PCs and the like are always a work in progress. They are not an end product.

I do believe, however, that some more end products will be coming soon. Using the PC to emulate an end product will be OK, but more specifically, people may be more readily accepting of network computers if those computers can do much of what they need and if the computing facility is offered to them as a service (yes, I'm aware it's like Microsoft's proposed subscription model for software, part of that .NET thingamajig). I believe at some point there will be very little you can add to software like word processors or spreadsheets that amounts to more than a slight incremental improvement. In other words, the product "matures", and once that happens, it is ripe for being implemented as an "end product".

This whole idea hinges on computing power as a service (a utility, like electricity or telephone service). Much like Lou Gerstner's quote in this xmlhack article.

Bottom line: PC software reliability is not so much a programmer problem as a complexity issue, in a world where the platform being used is a rapidly and constantly moving target. If we want software reliability, we need to nail down snapshots of hardware and software and produce end products from them.

Of course, simplicity in any kind of software will always go a long way towards reliability. I'm sure the software on airplanes and space shuttles works reliably, and on much older hardware - they are end products.

Idea: Universal pseudo programming language using XML

I wonder how hard it would be to create an XML application that could encase and express any programming language concept. A universal pseudo programming language that can be specified in XML, that is. It may be a lot of effort for no reason, but I think that the open source community, more than anyone else, would benefit from it.

All programming languages have their own syntaxes. But when you cut through all the differences in vocabulary and such, the grammar is roughly similar. The biggest difference you might encounter would be object-oriented vs. procedural programming. I'm sure there are others.

But even object-oriented programming is procedural in its methods. OOP has a different conceptual structure, but the basic elements - variables vs. attributes, functions vs. methods - are similar enough that we can treat them as roughly equivalent.

Enter pseudo code - the language-neutral way to express a programming concept/idea/algorithm. We specify a programming concept in pseudo code as a service to those who are not familiar with the programming language we are using. It also eases the programming-language prejudices in the community. Pseudo code should be even easier to understand than the high-level programming languages that were designed to be readable by English speakers. Implementing the pseudo code in a specific programming language is then a simple matter of someone taking the time to put the concept into code.

If there were a way to specify pseudo code in a consistent, standard format, then producing the code for a specific programming language could be pretty trivial (or so I assume - this will haunt me, I'm sure).

Enter XML. Ever since I started learning about XML, I became thoroughly enamored with XSLT. What a great, powerful concept! I thought that the cornerstones (for me anyway) of what makes XML great were:
1. portability/extensibility/interoperability,
2. general-purpose parsing,
3. transformation,
4. validation/DTD/schema enforceability.

So I thought, why not express and store pseudo code or any other programming concept in XML? If an XML application were created with the appropriate DTDs or schemas, the XML-stored pseudo code should be reasonably human-readable. I think basic syntax could even be checked using an XML parser's facilities. The most important thing for me, however (this is the point I've been trying to get to all this time), is transformation.

By creating appropriate XSLT files to transform the pseudo code into code for a specific language, we gain in several ways (and maybe lose in a few):

- Libraries of algorithms can be stored in a language-neutral way and reconstituted (just add water!) into any specific language by applying the appropriate XSLT transformation.
- Code sharing is encouraged not only across platforms, but also across programming languages.
- Code can be (I think) checked for rudimentary syntax problems.

- Programmers may become lazy and not want to understand a piece of code: just transform, cut, paste.
- I don't know how difficult it is to implement such a system that would be able to effectively accommodate any programming language past, present, and future.
- The pseudo code may be more difficult for humans to read (ah, but wait! There could be an XSLT transformation that produces more human-friendly output!).
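To show the transformation idea end to end, here is a toy sketch. The XML vocabulary (<assign>, <var>, <call>) is invented, and since a real system would use per-language XSLT stylesheets, a small hand-rolled Python walker stands in for one such transformation:

```python
# Toy illustration: pseudo code stored in an invented XML vocabulary,
# walked and emitted as Python source. A real system would ship one
# XSLT stylesheet per target language; this walker stands in for one.
import xml.etree.ElementTree as ET

PSEUDO = """
<program>
  <assign>
    <var name="area"/>
    <call func="multiply">
      <var name="width"/>
      <var name="height"/>
    </call>
  </assign>
</program>
"""

def emit(node):
    """Recursively turn one pseudo-code element into a Python fragment."""
    if node.tag == "var":
        return node.get("name")
    if node.tag == "call":
        args = ", ".join(emit(child) for child in node)
        return f"{node.get('func')}({args})"
    if node.tag == "assign":
        target, value = node  # first child is the target, second the value
        return f"{emit(target)} = {emit(value)}"
    raise ValueError(f"unknown element: {node.tag}")

def transform(xml_text):
    root = ET.fromstring(xml_text)
    return "\n".join(emit(child) for child in root)
```

Running transform(PSEUDO) produces the line "area = multiply(width, height)"; a second walker (or stylesheet) targeting another language could emit the same concept in its syntax instead.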

I don't even know if the whole idea is sane, but it sure seems cool to me. I'd love to work on something like this. I would also love to see what other people may think of this.


wen certified others as follows:

  • wen certified miguel as Master
  • wen certified nymia as Apprentice

Others have certified wen as follows:

  • nymia certified wen as Apprentice
