Plan for a Corporate Desktop

Posted 12 Aug 2004 at 11:30 UTC by scrottie

People often assert that maybe, perhaps, Linux is acceptable for the desktop. Having used corporate desktops at a dozen companies, and having had the privilege of designing IT from the ground up, I have a few things to say here. Microsoft Windows on the desktop has problems, but I contend that the things it completely fails to attempt are more important than the things it does badly.

I'd like to share some practical insights from my experience rolling out corporate desktops. I've climbed the ladder, going all the way from the bottom, and I've seen the Good, the Bad, and the Ugly. I'm a better person for it, and without these lessons, I couldn't possibly understand what's wrong with the modern desktop, or how to do it correctly.

<h3>Bashing Microsoft</h3>

The purpose of this article isn't to bash Microsoft (just as the purpose of taking a walk in a blizzard isn't to get lost - it just kind of happens). My intention is not to offend. There is nothing fundamentally wrong with Microsoft's technology (or any other thing - things are amoral) unless people confuse what it is and expect it to be something that it isn't, causing grief for themselves and others by operating on irrational beliefs. (And the marketing folks are, of course, worthy of some blame.)

<h3>Using the Bad Desktop</h3>

My first experience of using "desktop" computers designed for office workers to do productivity-ish tasks was not long after Windows 3.11 came out (whenever that was). The backend was a Novell install which served files. People wrote memos and saved them in shared folders. They edited spreadsheets and saved them in shared folders. They ran DOS-based programs that updated shared databases. A few people ran a DOS accounting package, some others a DOS order fulfillment package. PHBs got reports run off on line printers. The scope of "Information Technology" in the company was very limited and entirely determined by what a few off-the-shelf (but expensive) programs did. I think the human resources people had a program, too, but we stayed away from them - the power of life and death creeps us IT people out. I was hired to help manage the Windows 3.11 desktops.

When I signed up there, I had been MUDding for a good long while (MUDs being Multi-User Dungeons - text-based multi-player games). I'd run around the States, pretending to be a student at a dozen major universities and countless small universities and colleges. I used email, ftp, finger, gopher, and telnet on a daily basis, knew a little shell scripting, and was passable with C. I was used to hunting down where people were logged in and going to them physically; sending off a quick email for a quick reply, or a slow and laboriously constructed email for careful consideration; hopping on IRC or such things for background chat with my coworkers; and running programs that other people had installed in their home directories.

At this job, people did only the last of these things - run programs from shared directories - and this was possible with many, but not all, DOS programs. (These DOS programs ran on the desktop rather than the server - I was used to either telnet'ing into a server and running things there, or else running them on a Unix workstation that NFS-mounted /usr. Windows 3.11 and later versions all had a concept of "network installs", but for various reasons it isn't usable, so I'm proceeding as if it doesn't exist.) Even this thing - running programs from shared drives - Microsoft was moving away from with Windows 3.11. Windows 3.11 was cute. The primary thing it did was present a GUI and let developers write applications for it. Secondarily, it ran DOS programs in DOS boxes, timesharing between them. As far as networking or the corporate (or educational) environment went, Windows 3.11 did nothing. All of those things that I outlined above, that I did on a daily basis, just plain weren't possible. Windows 3.11 caused me absolutely no grief whatsoever. No one's system worked correctly or did anything useful, no one expected it to start doing so any time in the future, and therefore no one expected me to make it do so. Users had simple requests - "I need to write a memo and print it". No problem, I'll show you how to do that on your machine, and failing that, I'll set you up on someone else's machine. Back then, users weren't glued to the tube - it was common to have someone else in your cube using your PC while you sorted some papers or read a printout. Using someone else's computer was no more odd than using someone else's phone.

I didn't think of this company as being ignorant or backwards - in fact, I didn't think much at all about computers in an office setting. The things that I did with computers, while useful, had too much of a learning curve. Would any of these people, or indeed any office workers, ever learn to chat over their computers using something like IRC? Ha, never! Not in a million years! I was a hobbyist; they were, well, office users. And even if someone put the same amount of devotion into making these office machines useful as I had put into making my Amiga 1000 at home do cool stuff, what could they possibly do that would be so damn useful? And I wouldn't waste an Amiga on office tasks - spending $10,000 to buy a DEC 3100 (MIPS RISC Unix workstation - mmm!) to do anything other than research or run shell accounts for hundreds of users was silly. That's 5 times as much as the 386 machines that they weren't making use of. Here were people who could barely manage a photocopier barely making any use at all of an extremely limited system. Windows was cute - cute like a baby alligator, I'd learn much later.

<h3>Using the Good Desktop</h3>

Many years later, in my mid-20's, I'd go to college for real, after logging probably 6 years pretending to be a college student. I got a student job at a very cool place (and one day, I'd like my job back, please). The server room had the machine that Gopher was developed on, which ran the first Gopher server. That's right, the University of Minnesota! It was a business-incubator type thing that specialized in doing pretty standard IT-ish projects, but doing them using bleeding-edge technology - such as Java 1.0. (Yes, they made some correct calls as far as technology goes.) There were a lot of servers - some of them ran Solaris, some A/UX (Apple's old Unix with a MacOS layer - I never got to use one of these beasts, but they're widely regarded as one of the coolest Unix experiences you can have - all of the Unix config was done in a Mac OS 7/8/9 style GUI!). I think there was an NT 4 machine in there. On the desktop, people ran - whatever. Solaris. Windows. SGI. The guy heading up this unit was a big Mac head - had the Newton, the PowerBook (perhaps when they were still 68k), and so on. These people did everything I did with computers, a whole lot more - they were hackers in the best sense of the word - and they were damn productive developers who knew their stuff. People ran X applications and displayed them on their desktops - either from unofficial servers in the closet, other Unixy desktop machines, or the main servers. Most people primarily used the shell - they were using ssh long before it became popular, and before OpenSSH started - back when *the* ssh implementation was commercial open source software. The whole office sat on IRC (unless you got really busy and needed to avoid distraction). People ran mh or whatever for email. There was data and custom scripting everywhere. It was extremely common to hear on the IRC channel, "where's the script that does X?", where X is some random task.
The most important thing was that everything was accessible (in a manner of speaking - these people were also security fiends). Data was not left on the hard drive of a machine that was shut down or unreliable as a server. This was the embodiment of the Unix "tools" approach applied to a team of people - each person wrote tools, everyone used the tools, and Unix itself was the platform on which multi-user tools could effectively serve multiple users.

<h3>Deploying the Bad Desktop</h3>

Immediately afterwards, I moved from Minnesota to Arizona (sex) and had to find a new job. I took up with a temp company that did IT stuff and found myself with a bunch of Microsoft goons - but they had mutated. They weren't the mild-mannered, computer-hobbyist, part-time accountants trying to help middle-aged ladies write memos - they were mean-spirited, self-important control freaks who didn't give a damn whether or not you could figure out how to write a memo, nor whether the computer would let you, and detested that you couldn't figure it out, or that you "broke" the computer.

Microsoft Windows, now NT 4.0, was the same terminally confused, befuddled, fragile thing that 3.11 was. It was a lot better to program on, and it came loaded with a TCP/IP stack (the stacks for 3.11 were wretched - just you _try_ to use two Internet applications at the same time!). It tried to prevent applications from crashing the machine, but it itself went down in flames often enough to make up for it. Fundamentally, it still didn't run applications that someone else had installed, neither on the local machine nor remotely. You could do a network install, but very few programs supported that, and they were programs that were easily installed anyway. You could do remote sessions (I forget what it's called - VNC cloned the idea), but then you had two discrete desktops, and a Windows machine (other than Citrix and Windows Terminal Server) could only manage one desktop, so you couldn't write a program, put it on your machine, and have other people casually use it, nor could they casually run it on their machines, so only "really important" programs would get "deployed". The result is that no one ever ran anything other than Microsoft Office - with very few exceptions. Microsoft didn't create an applications monopoly by providing a better, more tightly integrated computing environment; they did it by continuing to sell something that's only marginally better networked than Windows 3.11. Registry data could be sucked down from a fileserver so that preferences could follow people as they moved between workstations, a few jabs had been taken at security, a new permissions model had been created for fileservers, but otherwise, it was the same features and the same architecture. From my checklist of things - chatting live, sending email, sharing files, running applications installed by/written by others, locating users - applications had brought Windows up to speed on email and chatting (again, I never thought I'd see office workers chatting live on the 'net).
Files stored on the fileserver were disorganized and detached from their applications, as the applications themselves ran only on individual clients (unless the application was part of Microsoft Office), so the only data that got shared was Microsoft Word memos and Microsoft Excel spreadsheets, with a little Microsoft Access thrown in.

Just as shocking as seeing office workers chatting online with ICQ was this company's feeble attempt to build any sort of office platform out of Microsoft Windows. At my last job, hundreds of little scripts had cropped up all over the place to manage workflow, keep databases up to date, report, and generally embody the vast business logic of the tiny incubator company. Here, I was working for a very large company that was literally powered by saving Microsoft Office documents in shared folders. While I was there, they bought a program costing hundreds of thousands of dollars - which turned out to be a badly written Microsoft Access file with lots of Visual Basic, presenting an application built on top of Microsoft Access. It was at this point that I realized that something was badly wrong, though it would be a while before I could put my finger on it. All I knew was that something just wasn't right. The company was glad to have this extremely expensive application, and patiently put up with it failing to operate when more than 3 or 4 users were using it, with it crashing and leaving corrupted data, and with having to bring it back from backups and people having to re-do things they'd done, as a weekly routine. How badly they needed this application, and how long it took them to find any company selling something fitting the requirements, outweighed anything that could possibly be wrong with it. They happily hired people to support it - like priests, they backed up and restored the sacred data. Meanwhile, tens of thousands of other little workflow and custom business logic applications sat in silent need, probably never to be created, probably never to be dreamt of (though I may be wrong - I'm still reeling from the shock of seeing "office workers" using ICQ!).

The "techs" that had to support the Windows machines - that's a far less happy story. Computers had become critical to business operations at some point, so machines had to work, and the techs had to make them work, but they were no more willing or able to perform than back in the 3.11 days. Users were far more technologically savvy, but the techs hated them more than any tech ever hated a 50-year-old woman used to a typewriter trying to use Microsoft Word on Windows 3.11. These users wouldn't just use someone else's machine when theirs wasn't working until someone got around to fixing it a few days later. They wouldn't just go back to the last printed-out copy of their data when their digital copy got corrupted. In order to try to keep Microsoft Word, Microsoft Excel, and Microsoft Outlook running (remember, the computing platform essentially *is* Microsoft Office now), the Chief Technical Officer mandated that people not be allowed to do anything to customize their computers - and the techs zealously enforced this, as it made their job easier. Never mind deciding to use a Mac as your desktop - it violated company regulations to change your wallpaper! And, rather than writing applications for clients, or internally for the company, my job function was to load Dell Optiplexes onto large carts, wheel them into another building, remove them from the carts, load Dell Optiplexes that had just been "baselined" onto the cart, wheel the cart back into the other building, and plug them back into the mice, keyboards, monitors, and network connections at each workstation. (A large number of the techs would pay little visits to the trunks of their cars while moving machines between buildings.)
The "baselining" process was interesting - somewhere deep in the extensive bowels of the IT department sat a team of people who contrived Microsoft Windows installs that were up to date, had all of the necessary software installed (Microsoft Office plus a few other things), and didn't have their DLLs mangled by DLL Hell syndrome, or suffer from other afflictions such as bad data in the registry. This install image was burned to CD using Norton Ghost, and other techs, like myself (yes, this was a demotion for me from my past job), would either wheel machines in and out, or else plug them in at temporary workstations and blow away their hard drives with the "baseline" install on the CDs. If we found a machine where a single setting had been changed, we'd look up on paper (not using software) who had the machine, and report them. They'd get in trouble. Sometimes, people would install animated mouse cursors that would crash the machines - they'd really get in trouble. If you installed software, you were in serious hot water. The techs psychologically blamed the users for having to cart machines around and "baseline" them endlessly. There was a story going around that a tech once found pr0n on a machine and the person who had that machine was fired. Every month or so, without warning, an office worker would come into the office to find a different (but physically identical) machine under their desk, and all of their settings gone (left-handed? Ha! Person who had the machine before you liked their clock set 15 minutes fast? Ha!). Of course, you weren't to leave data saved to your hard drive - it goes on the fileserver. And developing little one-off applications to help you with your job - forget it. There was a specially designated IT department, and machines in there were not swapped out and baselined. They had their own procedure. I would later work there.
By the way, they had one of those big-company special relationships with Microsoft, and we kept getting versions of something whose name I forget that let you push software updates to the desktop - each time it was used, it left about 1/4th of the machines (sometimes more, sometimes less) unusable. Numerous and strong efforts were being made to avoid carting machines around.

This might sound severe, but I've since worked at numerous companies that used Windows on the desktop, and they've all had policies (either written or enforced using software) about installing software on the machines. This was an extreme case, but not by that much - it's very much the same actions and same spirit as exists in every company. I'd also gone into a small company that used Windows primarily on the desktop (initially - by the time I left, it was a Linux shop), and they were desperate for some technological way to prevent their 12 or so basically computer literate employees from hosing up the machines and losing company data week in and week out.

These aren't companies that pine for something better - they're proud of using Microsoft products, they consider themselves to be extremely productive, well managed, secure, and modern. Their IT directors will proudly boast about how well Microsoft is running for them.

<h3>Designing the Desktop</h3>

I've had a sad chapter, a happy chapter, and then another sad chapter, so it's time for another happy chapter. For two small companies, I've been lucky enough to design the desktop. By the time I reached this point, I'd thought a lot about what I did and didn't like. I was even so cocky as to decide that some things just plain didn't work. So, without further ado, here are the tech specs.

Servers - Linux and BSD. All of the servers had cool names (named after alkaloid compounds at one place - grep -i 'ine$' /usr/share/dict/words). Various BSDs ran the firewall and mail server. Linux ran the fileserver and applications server. Desktops were - whatever. X-based applications were pushed to the desktop from the applications server. Microsoft Windows applications (where no good free or native version was available) ran under Wine and were pushed over X. For you young'uns, X is a "network transparent" protocol, so you can run a graphical application on one machine and use the application on another. The "applications server" is the machine running the applications - all of the office and productivity software was installed there. The windows are all managed by your local window manager, or the window manager itself might run on the applications server. And yes, Solitaire, Hearts, and Minesweeper were all on the applications server.
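To make the network transparency concrete, here's a minimal sketch (the hostnames "mydesk" and "appserver" are hypothetical) of how an X client decides where to draw. The DISPLAY variable names the machine running the X server, which is why a program running on the applications server can paint its windows on your desk:

```shell
# DISPLAY tells an X client where to draw its windows.
# "host:0" means "display number 0 on that host".
DISPLAY="mydesk:0"
host="${DISPLAY%%:*}"     # the machine running the X server (the user's desk)
display="${DISPLAY#*:}"   # which display on that machine
echo "clients will draw on $host, display $display"

# In practice you'd start the program on the applications server, e.g.:
#   DISPLAY=mydesk:0 gnumeric &     (plain X; trusts the LAN)
#   ssh -X appserver gnumeric       (ssh sets DISPLAY and tunnels the X traffic)
```

The commented launch lines are the usual routes; ssh -X is the encrypted one.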

Did it fly? Hell yeah, it flew! Applications would be installed on the server, and added to the launcher, when it became clear that something would be useful to more than one person. A lot of specialized image processing applications wound up there. People screwed up their desktops left and right, as they are wont to do - but they were given an OS install disc and told to reinstall, which they would, and then someone would come by, set the machine to DHCP, and load up the X server off the network. xinit ran on the server, and the X server running on the clients would find it and pull a login screen right up. Users would log in, and all of their applications and data would be right smack there, still. Several people ran Macs (in the MacOS 9.x days), several others ran Windows, and a few were converted to Linux. Everyone ran software locally, too, but as the administrator, I wasn't concerned about them hosing their workstations, because, well, everything important was on the applications server or fileserver. They could even run applications locally on data mounted from the fileserver, where the fileserver was also mounted by the applications server. This office consisted of mostly semi-technical users, but there is absolutely no reason that thin clients couldn't have been deployed in a large company to serve non-technical users or users with very narrow job scopes. Even a user with a thin client would be able to (unless some Draconian policy dictated that technical measures be taken to prevent it) download software to the applications server and run it without affecting other users in any negative way. Obviously, the number of people playing Quake over X over the LAN should be kept to an absolute minimum, but there's no reason that you couldn't play Quake on a thin client without involving the sysadmin for any reason other than network bandwidth usage problems.
Productivity increased as the system became more reliable (compared to Windows desktops alone) - not only did people not have to wait for attention from some tech, but they had much more productivity software installed, and the backup problem was solved (something else that plagues companies using Microsoft Windows).


Microsoft Windows can't effectively run various applications locally, and central application install for Microsoft Windows is an unrealized dream; Unix is unstoppable with thin or thick clients, with people installing applications centrally and, of course, locally.

Things like Citrix, besides being expensive, don't solve the problem of letting users install their own applications in any way that can be sustained by fewer than one system administrator per dozen users. Because of the inherent problems of running an office that uses Microsoft products, the effective application base shrinks to Microsoft Office plus a few large applications, and applications that may be created inside of Microsoft Access. Microsoft shops seldom grow any sort of custom business logic. The canned Microsoft desktops users are forced to use are inferior to what is offered by thin clients in terms of features, stability, and administrability.

Make no mistake - I've reached the conclusion, in this article and in my experience, that a software platform more diverse than just one office suite is essential to productivity. Specialized pre-written applications, popular pre-written applications, and small custom applications are all also critical. I haven't yet said how these custom, small applications get written - they're written by the IT department, and in my experience, this is exactly what the IT department in a Unix shop does. Rather than hauling computers around on carts to constantly reinstall and patch them, they have time to make users' lives better, moving forward. And this is exactly what I did - write small helpful scripts to address business needs rather than the core technical needs of keeping workstations running and patched. A company that limits itself to an office suite dooms itself to DOS-era use of the network. Rather than hating their users for making them do banal, tedious work, a Unix sysadmin has a lot of potential for helping the modern equivalent of that 50-year-old typist get her work done efficiently and with full advantage of technology. Unix administrators are well known for hating stupid users, but Unix administrators are almost never at the mercy of stupid users, having to fix their machines, or cart their machines around for reinstall. A Unix administrator can at the same time hate stupid users and provide advanced computing resources to useful users who are being productive and doing work for the company.
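As a sketch of what one of those "small helpful scripts" looks like - the task and the data here are made up, but the shape is typical - here's the kind of thing an admin can write in minutes and drop into a shared directory (say, /usr/local/bin on the applications server) for the whole office:

```shell
# Hypothetical helper: total the second column of a tab-separated file,
# grouped by the first column - e.g., orders filed per sales rep.
# Sample data stands in for a real shared file on the fileserver.
orders=$(mktemp)
printf 'alice\t12\nbob\t7\nalice\t3\n' > "$orders"

# The actual "tool": awk sums column 2 per name in column 1.
awk -F'\t' '{ sum[$1] += $2 } END { for (who in sum) print who, sum[who] }' "$orders" | sort

rm -f "$orders"
```

Once it lives on the applications server, every user has it immediately - no deployment process, no carting machines around.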

It's true that web-based applications largely mitigate the difficulty of sharing applications on Windows platforms, but it's much more painful to write a web-based application than a little Perl (or Python, or Ruby, or bash, or Scheme, etc. etc.) script that pops up a little Tk GUI or even just takes some arguments on the command line. On the web, the user interface is much more limited, it's very difficult to update the display interactively, there are obvious problems with long-running jobs, and programmers have to learn a bunch of HTTP gunk before they can start cranking out little scripts. Web applications, unlike things run from a shell account, need security added to them - shell scripts rely on the fact that you have an account on the machine and you're running as yourself. Things like Microsoft's ActiveX were poor attempts at solving the much larger problem I've explained in detail throughout this article - they solve only one small aspect of the problem (presentation), and they do so with dire consequences for security and stability.
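For contrast, here's the command-line end of that spectrum - a hypothetical markup calculator of the sort a shell account makes trivially shareable. There's no login page to write: the Unix account and file permissions already establish who's running it.

```shell
# markup: a hypothetical one-off helper. Given a cost and a percentage,
# print the price with the markup applied. awk does the (non-integer)
# arithmetic so plain sh can stay out of the math business.
markup() {
    awk -v c="$1" -v p="$2" 'BEGIN { printf "%.2f\n", c * (1 + p / 100) }'
}

markup 40 25    # prints 50.00
```

Put that in a file on the applications server and the whole office can call it; the web version of the same thing needs a form, a CGI handler, and an authentication story.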

The only thing I'd do differently is deploy Linux more widely to the desktop - and run OpenMosix on all of them as part of a cluster forming the applications server, blurring the distinction between "desktop" and "server". The result would be near the "ideal" scenario, where every bit of CPU available goes into running applications, applications can be centrally installed, users can install and run applications and even share those applications, data is stored in a central place where it can be backed up, and data is mirrored close to the user for quick access.

<h3>Conclusion</h3>

Microsoft Windows is unusable on the desktop. It's unusable as a general-purpose computing platform, and more closely resembles a thin client that's expensive and tedious to administer. It isn't suitable for non-technical users, semi-technical users, or power users in the enterprise, each for different reasons. It isn't suitable for non-technical or semi-technical users at home. It's suitable for technical users at home when they happen to like Microsoft Windows. Linux has a long way to go on the desktop - hopefully forever. There are an endless number of ideas that haven't been thought of yet for making machines better serve humans, and open source software (and Free Software) has positioned itself to make that journey. The strong reverence for standards, open architecture, modularity, network transparency (on many levels), and a solid technical base are critical, but most important is the philosophical disposition towards helping the useful user (ignoring for the moment the idiot user who shouldn't have a desk job, desk, or job). Linux and other open options invite innovation from companies and individuals alike, both within a company and as a service on the free market. While the number of things configurable by GUI out of the box is limited, and those things tend to have sub-par user interface design, the vast possibilities for configuration by config file more than settle the score for any self-respecting company hiring humans rather than baboons as administrators.

Perhaps I should make more clear, posted 12 Aug 2004 at 11:41 UTC by scrottie » (Journeyer)

In the old-old days, the primary job of IT was to write COBOL, FORTRAN, etc. applications to serve the particular needs of a company, not to keep the dumb terminals running. While PCs are more generally useful than dumb terminals, it's sad that the shift has moved entirely to canned software and entirely away from custom software, because the demands of the PC as a replacement for the dumb terminal are so great. In this case, I'd say pitch the PC and replace it with a dumb terminal. -scott

wow., posted 14 Aug 2004 at 10:45 UTC by lkcl » (Master)

GREAT article. makes it abundantly clear as to what windows users are missing, for simple day-to-day computer use.

... where do you see .NET fitting in to this? .NET is supposed to make it easy for businesses to offer services to their customers.

Wow, thanks, posted 16 Aug 2004 at 11:32 UTC by scrottie » (Journeyer)

Thanks... I'm glad for the chance to get some of this stuff off my chest.

I realized part way through the article that some of my starting premises weren't entirely correct - but isn't that always the way? The world is far more complex than any simple view can do justice to. One of the premises was that Microsoft wasn't even trying to do certain important things.

Microsoft had in fact attempted to tackle the problems of deployment. They must have worked very hard on the software I mentioned that was being used to try to deploy updates and software centrally - and concluded, after a lot of hard work by a lot of smart people, that it wasn't practical without architectural changes. I'm sure that, in the same way that Windows 95 worked overtime to support all of the dirty tricks that Windows 3.11 programs used, they were fighting a battle of vast scope. Running software off of the network doesn't mesh with a model where a program expects to be able to leave data files strewn around, prance around the registry, and willy-nilly overwrite DLLs to ensure compatible versions (for it, anyway) of everything. But Microsoft pursued this idea vigorously with their own applications.

Microsoft didn't initially see the potential of remote execution such as with Windows Terminal Server or the original Citrix, but when Citrix licensed the Windows source code to customize it to support the multiple desktops required, Microsoft saw the value. Of course, every version of Windows should do what Citrix does, if they wanted to fight a feature battle with Linux, at least. ActiveX, come to think of it, is another example of Microsoft trying to realize network transparent, rich computing.

But that wasn't your question... you wanted to know how I thought ".NET" figured in. I don't have anything remotely authoritative to say, but I can speculate =) I think ".NET" (I hate overly generic names - such as "Windows", "Access", etc.) is very practical. XML (SOAP/RPC) is the new RPC (remote procedure call), and this is seen as the future of making different machines work together. It targets high-level developers and companies that develop software, not users, which doesn't jibe with Unix's model of targeting advanced users. I don't think tight XML integration will make software deployment or casual development easier, and I think that gets back to Microsoft's mental model of how a company should be, and is, run, which gets back to how Linux invaded the server room so successfully. Microsoft expects an IT department to coordinate every computational action in a company. IBM, in the old days, expected the same thing, and still kind of acts like that. In any adequately large company, nothing gets done except by coup. Microsoft expects IT to deploy servers, license operating systems, buy expensive software (like SAP - if you're ever informed of a company migrating to SAP, wait until 2 years after the integration project is past due and then short sell), and then train brainless users how to use the great design. As programmers, we all know that the universe is too complex to plan a route through. That'd be like hopping in your car, remembering the route to New Mexico, and then closing your eyes and executing the maneuvers. This is completely doomed to fail. In the real world, work is done despite management, it's done without official resources, and it's done on short notice. Linux is usually installed on spare desktop machines and attached to whatever network connection works. A few years back, it came to collective attention that every major company ran Linux somewhere, whether they realized it or not.
These companies would protest "we don't use Linux!" like some sort of bigot declaring that none of his grandchildren would ever marry outside the race. Hah! How much control do you really have over your grandchildren? In much the same way, Enterprise Java and ".NET" do a great job of taking the considerations of CTOs and IT managers into account, but ignore what the people "in the field" are doing with the computational equivalent of baling wire and duct tape (Perl - the duct tape of the Internet!). I don't think (or I'm too ignorant to know) that ".NET" really enables the poor schmuck in cubesville to autonomously deploy an application that the poor users, with locked-down desktops, can really use. The HTTP model doesn't provide a rich user interface, and web-based applications are extremely limited in how they're able to talk to other applications (uploading CSV files and pushing/pulling RSS feeds are two examples - I've certainly worked on web projects where the users had to upload Microsoft Access databases or Microsoft Word files). Running on the same system, directly querying a database is possible, and piping data between two applications works.

O'Reilly publishes a book called "Perl for System Administration". It's an excellent book. It isn't huge - it's kind of medium-sized, actually. It shows how to do all sorts of things with Perl - querying databases, filesystem work, sysadmin stuff with users and so on (of course), creating daemons, security topics, revision control, and a bunch of other things. It isn't marketed as a "how to write applications" book, but for what people in the trenches need, it beats the pants off of the 2,000 pages' worth of Java Enterprise etc. books that cover the same topics with many more vagaries and open-ended trailings-off. "Perl for System Administration" accomplishes this by gluing things together, spectacularly demonstrating the Unix "tools" mindset. It's hard to have a "tools" thing going on when each user and each application is locked away on their own machine, you can't get a machine for your project because you're a wanna-be skunk works, and none of the other machines, including the users' machines, are even willing to talk to you. The only working design is one where data is available by default, once permission is established. (I'm not in the Python camp, so I'm using Perl examples here, but I don't mean to disparage Python - Python also easily does what people actually need done, in a way friendly to people in the trenches - ditto for Ruby.) This isn't to say that Java is bad, just that it's catering to IT managers, not the poor saps in the trenches getting things done by coup.
But I said ".NET" is pragmatic - it gives up trying to address network-transparent computing, but it fixes DLL hell, and it works very hard to address the extremely important question of the security of networked (client/server) applications. I'm sorry, high-level application code should not be written in C. And (as I said in a previous post here) the Unix folks have to play some catch-up here, or Microsoft is likely to pull ahead on the security curve. Java might not be much more expressive than C, but it has the best chance of being a universal Unix-land replacement thanks to gcj - it's compiled, arrays are length-checked, ... okay, the list of problems is just as long, but that just shows the pickle we're in. All programmers won't be made security experts overnight, so the only answer is to remove the pitfalls that they keep walking into - buffer overflows must die with C.

I'm just finishing a major project (it's a book! A book! A book!), and I'm toying with ideas of what to do when I'm done. I've been a consulting weenie for a long time now, and it's really wearing on me, emotionally and physically. Starting a business providing one fixed service might not be a lot better, but I feel like I should do something where I don't have to spend half my time trying to collect, arguing about specifications, and generally groveling pathetically. I'm toying with the idea of a sort of "goto my PC" kind of thing (a website that proxies access to your PC with VNC/Norton PC Anywhere-like software), but instead actually providing the desktop, too. I'd like to see people running filesharing clients to get their friend's garage band out over a high-speed link, playing with loads of Linux software they might not have installed themselves, sharing documents with other users, building web pages using IDEs and visual editors running on the same machine, using graphical load tools to watch their CGIs and databases run, and all sorts of cool things like that. Of course, these things are possible now, but ISPs tend to have policies against leaving long-running processes going, and they don't provide the X libs and headers, not expecting people to want to run X applications over the net, and so on, and so forth. They don't want you to get too comfortable on your shell account because, hey, you have your own computer, right? People don't want to piss off their ISP by using too much bandwidth and getting their home Internet connection cut off - and in most areas, only cable or DSL is available, not both (from what I've seen), so this would be a mortal blow. People don't want to open ports on their firewall because they know they don't keep their systems up to date and know they aren't being as vigilant as they'd have to be.
They don't want to give their friends passwords to VNC running on their own machine because they likely store all kinds of data (financial, pr0n, other passwords) on their computer, and most people don't run Unix-like systems on the desktop anyway, so they're at odds with permissions, and only one remote desktop is possible anyway. With a second "virtual" PC on the net, you could store only the files that you didn't mind your friends seeing, give them access to your account (through group permissions or with your password), let them copy things in and out, let them come in and add some great band they want you to hear to the download queue in your filesharing client, leave you a sticky note on your virtual desktop, etc. That's for the home user. For web development, there are obvious advantages to providing a nice GUI environment rather than just running ftpd, sshd, and httpd.

I know, I know... I really need to go to my Ranters Anonymous meetings...


More rantage, posted 17 Aug 2004 at 08:43 UTC by tk » (Observer)

I see prozac also posted something about the current `office' paradigm.

I still don't know anything really concrete about this whole .NET stuff, except that Microsoft says it's good. If I'm not wrong, in the good old days of text consoles, whenever one wanted some sort of `interactive' online service, one just had to telnet to some specific server and account. Now we have SSH with X-forwarding -- the modern counterpart of telnet -- but somehow people aren't using it to provide interactive online services.

fundamentals, posted 25 Aug 2004 at 17:09 UTC by lkcl » (Master)

One of the premises was that Microsoft wasn't even trying to do certain important things.

when you sit down and use a windows system, you just know that something is fundamentally just not right.

the five or so people i have put onto KDE 3.2 who used to be windows users complained BITTERLY. one of them even went to the extreme lengths of HIDING his computer in the warehouse when it was his turn for it to be upgraded.

after three weeks, they loved it.

Eh Tu, Linux?, posted 31 Aug 2004 at 17:43 UTC by prozac » (Journeyer)

The premise, if I understand it correctly, is that Windows sucks and Linux rocks?

Or is it, isolated client PCs running Windows fails to live up to the Unix server model ideal?

"A company that limits itself to an office suite dooms itself to DOS-era use of the network."

I like that, but actually it could be stated more accurately as:

A company using office suites is doomed by the DOS-era use of the network.

What alternative is there?

I think that the best alternative is/will be one that goes back to the time-sharing days, as you alluded to:

"While PC's are more generally useful than dumb terminals, it's sad that the shift has moved entirely to canned software and entirely away from custom software because the demands of the PC as a replacement for the dumb terminal are so great."

But what kind of custom software?

"...but it's much more painful to write a web based application compared to a little Perl (or Python, or Ruby, or bash, or Scheme, etc etc) script..."

Having a good HTML/XML/??? API that a script can use to handle the "painful" stuff is an answer.

A Web Interface to a cluster of Linux PCs is an answer.

The death of Windows as the de facto Desktop is highly overstated, and the rise of Linux as the new Desktop is overrated.

As it is currently, on the Desktop, Linux too is doomed by the DOS-era use of the network.

Linux as Server is top dog. Windows as Desktop is top dog.

Windows as Server is only a Linux wannabe.

Linux as Desktop is only a Windows try-a-be.

The Desktop paradigm was designed for a single User system.

All attempts to get many Desktops, each running their own version-mismatched copies of programs, to write to a single file/database on a network share are an accident waiting to happen.

Windows as a Game Platform is why there is only Windows software in Staples, Best Buy, Sears, etc., etc. Millions of copies.

Advogato is an example of an answer, posted 31 Aug 2004 at 17:52 UTC by prozac » (Journeyer)

I would like to add this little tidbit....

Millions of people can join Advogato, create accounts and projects, upload text, post data, etc. With more "script modifications" people could upload images, edit images, collaborate on documents, create interfaces, etc, etc.

The potential limits are our imagination only.

And these millions of people can join Advogato with any HTML enabled device. PC (Windows, Linux, Mac, BSD, etc.), Palm, PDA, Cell Phone.

Imagine that.

The technology exists.

2 prozac, posted 2 Sep 2004 at 08:45 UTC by Malx » (Journeyer)

Sorry, but the ability to do something is not enough any more. It is already an old thing :)

Now the rules are:

  • how fast you can do things (WebUIs with page reloads are not realtime-fast)
  • how nice and reliable it looks (not the case for WebUIs)
  • how much of the device's capabilities you use (hey! the article talks about useless waste of CPU, but you propose HTML, which wastes the graphical capabilities of the PC! no 3D, no video animation! Not even use of the right mouse button in the browser for your own needs ;)
  • whether the user can customize the software (for HTML it is almost never the case - it's all up to the server-side programmer).
  • etc...

And I can't join Advogato from my mobile, just because it would cost me too much (the page is not designed to save every character - it is designed for a normal PC and an ordinary Inet line). :)

Joel(?) says the thing he really misses in an HTML UI is that you can't do inline spell check (like in Word, which underlines wrong words).

We rock, they suck. But they have resources for new prototyping, go figure., posted 5 Sep 2004 at 08:21 UTC by mirwin » (Master)

Information technology for the technocrats: the users should like what we choose to give them, as we choose to give it to them, and sing our well-earned praises to the heavens for the edification of all worthy peoples and alien civilizations now and forever, amen.

Unfortunately for information technology technocrats who know better than all executives, managers, and coworkers: existing businesses tend to have operations which provide revenue. These tend to be mundane things like engineering optics systems for mass production, hauling things to and fro, refining metals, mass-producing cardboard boxes, printing, etc., etc.

The operations departments quite naturally want control of their critical resources. Since they generate the revenue, they have a lot of influence in the organizational hierarchy. The first time an automated manufacturing floor produces thousands of incorrect parts (resulting in unrecoverable expenses such as wasted time and material, and missed deadlines in all product lines depending on receipt of those parts) because a centralized database server hands them an incorrectly versioned DNC file is when the effective managers start looking to re-invest in "expensive" local dedicated technology that their people control directly. Information Technology can then be used as a consultant who can be escorted off the manufacturing floor when they get too arcane, cutesy, or unfriendly, for later lambasting by the soonest available lowest common executive.

Helpful, friendly IT consultants can generally get budgets increased for unnecessary but potentially useful frills such as centralized servers, inter-department networks, and technology prototyping if they can demonstrate that they can help keep localized mission-critical technology operating, and that their new toys can be prototyped and tested in ways that do not adversely impact operations while possibly leading to improvements within operational revenue-producing work processes. In other words, a reasonable return on investment can be expected, delivered and, when audited, demonstrated.

Unfriendly, egocentric technocrats generally do not carry much water in the corporate community, and when their fourth rollout attempt on a new wonder-technology project goes awry, they are often easily fired. This allegedly occurs frequently because a "clueless" profitable company just does not "get it".

Of course, all of the above can be easily invalidated if you are wise enough to work for an internet information company which creates revenue from providing knowledge on demand to customers via the internet. However, I suspect the attitudes this thread illustrates and glorifies contributed greatly to the bursting of the internet bubble. Customers will not pay to be told they are hopeless, useless idiots indefinitely; they expect the producer/vendor to be able to deliver useful results via whatever means necessary. When knowledge or information access is the mission-critical product, the old attitudes about reliability and availability quickly creep into the brave new world of the formerly well-isolated and protected IT world of the infotechgeek.

An interesting omission in your diatribe. A user who is willing to push a couple of standard buttons such as "setup" or "install", live with the defaults as installed, and use the expensive shrink-wrap software for its advertised (sometimes even tested) purpose can use a modern high-end desktop to rival the capabilities of many large firms, regardless of the field of human endeavor, for anywhere between ten thousand and a million dollars' worth of applications software (the hidden cost of sufficient domain knowledge, previously acquired somehow, to operate such software should be obvious; do NOT under any circumstances let an idiot IT geek play with an X-terminal-based finite element analysis package intended for production use by an engineer on an IRIX SGI or NT workstation and X-terminal-based combinations thereof - you have been warned, and I accept no consequences or liability of any kind whatsoever incurred from misuse of this freely given alternate worldview). This "high-end" user is typically a specialist in a non-computer field. It appears to me that the only field of human endeavor where open source rivals this instant capability is systems and applications programming. In other words, applied or theoretical computer science.

Consider the merits of a highly successful new coal-mining paradigm that excels in inexpensively equipping new coal miners with excellent and improving coal-mining tools but disdains to interface with ignorant, incapable consumers too dumb to figure out how to transport the coal or get it to burn usefully. How high do the piles of coal get before it is uneconomic to mine coal? How many miners will continue mining for the sheer joy of exercise when it becomes apparent that no one wants the coal they produce and they will have to burn it themselves? Incidentally, do we have any coal-mining enthusiasts who know anything about power production or desulfurization for compliance with soon-to-re-arrive U.S. antipollution regulations? If not, hopefully they are willing to learn these new technologies for free and freely share the results of their labor with the still-increasing ranks of the free coal-mining community.

On a positive note, ArtOfIllusion appears to be an excellent production-grade modeling and animation package. It does not appear as capable as the expensive commercial Kinetix product 3DSmax, but it is free and Java-based. 3DSmax ran quite well on Windows - and only Windows 95 - and was available eight years ago. Much of the appearance and development paradigm of ArtOfIllusion appears to resemble 3DSmax, but there are elements of its user interface which appear much more user-friendly. With the lower initial learning curve and its free availability, I would not be surprised to see ArtOfIllusion start to take over in learning institutions dedicated to producing new animation and movie-effects specialists. Within ten or twenty years it could take over the movie industry! We could all be watching or purchasing low-cost or free collaborative internet multimedia content! Death to all Hollywood producers and tyrants!

Of course the above assumes:

1. ArtOfIllusion finds and maintains a viable resource acquisition and reinvestment model - either cash revenue, volunteered community resources, or some other oddball alien mechanism transmitted from Arcturus and decoded just in time by the seti@home centrally controlled (but now open source!) distributed supercomputer or a clone.

2. Kinetix (3DSmax) does not improve its customer service, price, production capabilities, etc.

To continue: while open source appears to be on or near the leading edge of computer technologies, it appears to me about a decade, possibly two, behind most specialized shrink-wrap domains. This should give it a larger advantage in static fields such as word processing and bookkeeping, but it is a distinct disadvantage in bleeding-edge fields such as commercial special effects or spacecraft engineering. Please do not misinterpret my skepticism regarding your theme of {non-Unix-familiar users are idiots who cannot successfully misuse technology developed for isolated desktops within a Unix network, either by themselves or with my awesome assistance}. To me it appears the open/free paradigm is spreading and is mutating to fit other domains. I am hopeful that someday it will be possible to purchase an open-specification machine tool and begin home manufacturing of the industrial plant necessary to manufacture a free personal spaceplane, using free data and specifications available on the internet to any Terran interested in slipping the surly bonds of earth. In the meantime I do not expect the need for civility between producers and consumers - people - to diminish.

Let's face it: since I can buy Windows software and type or click setup, I do not need to put up with surly technoinfogeeks at my personal business desktop. With peer-to-peer I can avoid central servers if necessary. Further, since I have decided to learn the open source paradigm and apply it to other domains of personal interest, in the future I will not have to put up with self-serving sabotage from Microsoft, Kinetix, SGI, Sun, et al. I will have to learn to be polite to authentic open source developers who are producing useful tools, but that is not difficult. They are generally knowledgeable, pleasant people who do not expect me to know everything they know or agree with all their personal beliefs. They are easily identified because their documentation is generally readable and usable and their software is generally clearly labeled regarding its production stage, the better not to waste my precious time. If they are working in domains outside of computer science, they are often eager to interact pleasantly and exchange information regarding domain-specific software, its use, development, etc. Bugs are often authentic development challenges regarding domain-specific knowledge or technology issues vs. sloppy code or library interdependencies.

The quandary that grumpy technoinfogeeks will face in the future is that open source projects robust and user-friendly enough to attract sufficient talent to succeed within the open source community will be easy enough for non-computer-scientists and non-technoinfogeeks to get started with. This will encourage non-infotechnogeeks to learn enough geekspeak and domain-applicable computer science to get by without infotechgeek assistance. The only viable open source business model I have seen (providing services) will still depend on providing friendly, productive (in the view of the customer) services. Exit the grumpy ideologue infotechgeek or PhD in computer science who cannot be bothered to interact pleasantly with other people. Of course this applies to space scientists and engineers who disdain the ignorant masses who currently know nothing of space physics as well. When enough open source developers (or aerospace hobbyists) get interested in space to develop the tools necessary to get to space cheaply, and learn sufficient engineering to apply their advanced computer science to building cheap, inefficient spacecraft that sort of work, then NASA's days of centralized glory will be numbered.

Seriously. You might laugh at this "vision", but consider this: the U.S. is seriously considering building an SDI shield. Fortress America. What is necessary to defeat this? Cheap ICBMs. Lots and lots of cheap ICBMs. If North Korea, or China, or Russia, or Israel, or Iran, or NATO are serious about having a second-strike capability sufficient to deter the U.S. from throwing its weight around in negotiations, then they should be considering how to get lots and lots of cheap ICBMs (or hypersonic cruise missiles) sufficient to overwhelm an SDI system. What better way to develop cheap cruise missiles or ICBMs than to begin developing open source tools and manufacturing specifications on the internet? Sure, most really nifty techniques are patented or secret or hard to track down physically. But many of these technologists play around on the internet. If a new development is published on the internet it becomes prior art and is theoretically unpatentable. Besides, whose patent courts shall we be using? Consider the open source car initiative in Germany (I am uncertain this is a real project; I have found only a few English articles and references regarding it via Google searches). Someone claims that automobile developers and enthusiasts are designing an open source specification for a leading-edge car - run a Google search on "open source car". Should this obviously fanciful effort succeed, perhaps a Mars rover or an "ultralight" spaceplane could be next. Actually, I think there are already open projects in progress regarding Mars rovers, robots, and actual "ultralights", unless they have faded recently.

I sense some of you readers are still unconvinced. Consider that ten to twenty years ago supercomputers were considered critical to U.S. national security, and their export was strictly limited and regulated. AFAIK it is still regulated, but the big-name manufacturers of dedicated supercomputers are no more (I think - could be wrong here). Meanwhile, several open source projects are available online that allow supercomputers of various architectures to be built using obsolete (and modern, if you can afford regular upgrades of hardware) personal computers. Is seti@home really looking for alien civilizations, or is it the computational engine the U.S. DOD is using to conduct the hypersonic simulations necessary to engineer a scramjet for next-generation military spaceplanes or hypersonic cruise missiles? Can you prove it one way or another? Is it possible that Area 51 (or some other secret design facility) has secure (really secret and encrypted) communications to Berkeley via the internet backbone?

Should the open source paradigm be applied to machine tools, how long would it take an emerging third-world nation to surpass the manufacturing capability of the U.S. or Europe? Perhaps not long, considering that much of our manufacturing has been migrating to low-labor-cost and unregulated environments anyway.

What does this have to do with grumpy infotechgeeks belittling "idiot" users? When my open source car/robot (add wings, control surfaces, and shuffle bits and pieces and you potentially have a private plane! Stack four and maybe a final spaceplane stage can get to orbit, repeatedly, relatively cheaply!) running robust open source control software has a pesky fuel cell problem, or I need a fix for an incorrect tolerancing specification that I discovered but am incompetent to correct, and a grumpy mechanical or chemical engineer or tech tells me I am a rude grumpy infotechgeek with no respect for my betters and to fix it myself or cough up some funds to have it fixed fast; I wish to be able to provide them with this link to help prove that I have been attempting to improve my personal attitude and even influence others in the community; and that I really, really appreciate their time and effort expended in designing and publishing open source specifications for cars, planes, and spaceplanes; and I know that I should be able to get the seven-axis automated machining center I recently completed from specifications available on the internet to apply the patch files automatically (most of the time), and I know that it is still my responsibility to personally monitor and interrupt the process to apply a couple of critical patches manually if I expect my bearing races to last past initial break-in; but if they could just help me this one time I could complete this open car/plane/spaceplane project I have been working diligently on for the past four years and even drive/fly it around the neighborhood from time to time! ... and I promise never, ever again to flame them as ignorant idiot computer users when they are having problems installing a new supercomputer kernel or patches and are on the verge of success...

Some passing grumpy infotechgeek will chime in to flame my potential saviors as ingrates who could not possibly have accomplished anything meaningful without open (no, FREE!, as in speech!) source software and especially (insert pet projects here), and despite my precognitive abilities, my potential saviors will evaporate in a huff; I will have to pound my head on the problem until it automagically goes away, wait patiently for a new potential savior to arrive, or unproductively abandon the project unfinished and rejoin the real economy.

Then a usually wise friend will advise me that next time I should: "Screw the community mailing list or bulletin board, stop wasting time with open source, stick to accumulating sufficient funds to purchase reliable commercial offerings or at least some rational private time with key developers who have applicable expertise."

I will reply: "The bulletin board, mailing list, or wiki is the key to developing a common knowledgebase that can be propagated throughout the community so that we thrive and grow and our capabilities continue to increase as individuals and as a community."

He replies: "Some prices are too high to pay. The stress and aggravation is not worth it. The boss said he had money to buy a spaceplane, and he has cargo that he needs to get to orbit right away; other projects are depending on that cargo." (I slipped here from my personal spaceplane to a spaceplane useful to business, unconscionably seeking funding, I guess. Titanium is not cheap. However, the implication is the same: the use of a private spaceplane for the rest of my life is just as potentially valuable to me as a commercial workhorse is to the investors and the management.)

In conclusion, is it my imagination or has activity on Advogato recently taken a downturn? Perhaps it is the summer weather in the northern hemisphere. Perhaps the better grade of developers and newbies (like me) get tired of the flamewars and ideologies. Still valuable to check in occasionally, however. Thanks to nymia and shlomif for pointing me to new tools applicable to online collaboration on animation projects. First you see it, then you build it. Of course, unlike software, hardware and non-electronic materials cost money, so perhaps we must see it, make some money off of intermediate products, then build it. Now I have confused myself: electricity costs money, and yet open source supercomputing software has somehow been developed on and for electric machinery.

I go now to fix my foolish error that misposted a submittal regarding collaborative multimedia under a call-to-political-arms thread. Actually, I think it was a software bug or feature that did this, not user idiocy (my dialer timed out and hung up, and apparently the reconnection sent the post to the top article instead of the correct article - or perhaps I actually made a mistake), but even if it was user idiocy that caused the initial problem, it is a feature of the site software that to partially correct the error I can post a copy to the correct thread but not delete the original, I think. If I am wrong and new features have been added since I have been here much, then perhaps I have gone far towards proving the anti-user "non-Unix users are idiots" theme presented above. In my defense, all I can say is that when I was studying engineering formally, and later learning software development management on the job, Unix was reserved exclusively for serious applications such as computer science or data processing. All engineering studies were performed by calculator or by filling out data processing requests to be submitted to the proper information processing authorities. It is a stellar achievement for the free/open source community that a non-computer-specialist such as myself can now consider assembling a personal desktop supercomputer myself, and that there are almost-viable (for idiots like me, not computer-savvy open developers like you) technologies suitable for building distributed platforms sufficient to allow extremely large open concurrent engineering projects to proceed, if they can attract sufficient hobbyists with applicable skills. I congratulate you all and urge you to keep up the good work. It is my hope that it will continue to inspire others to similar achievements in the coming decades.
It would be extremely helpful if you could all be nice to any computer illiterates who wander by; it is amazing sometimes what expertise in various fields people develop when they are not wasting their time doing computer science stuff poorly, but instead applying their own talents aggressively in their selected fields of interest. A successful open space program will need all of that talent and more to succeed in the long term.

Belated addendum: My plan for a personal and eventually a corporate desktop.

An open source supercomputer assembled by me, for me.

It will be securely attached to a trusted open source distributed engineering platform/environment established and maintained by aerospace hobbyists worldwide to pursue mutual goals and profits.

It will be a few years yet. Much of the above is at the bleeding edge, but the critical components all seem to be moving towards user-friendly installation and maintenance. Much of the above is government- and corporate-funded for obvious reasons. Some of it exists but is being actively held back or discouraged by hidden government and corporate agendas. The really cool thing is how development is proceeding around the trouble spots. Illegal to export supercomputers? Open source on top of standard low-end hardware components has arisen to substitute. Illegal to export a surplus copy of 3DSmax to Australia (an ally of the U.S., last time I checked) to help establish a virtual corporation pursuing space development goals by creating and selling space-based animation products, because it uses a form of distributed computing? (Yes, it was; much of the internet bubble bursting was not on Wall Street. It involved small players not getting off the ground who would have provided the big boys with lucrative markets for goods and services.) Now arriving, five or six years later: open source ArtOfIllusion... it is not currently capable of distributed rendering, but there are ideas floating around regarding how to do it with scripts. If this robust, well-documented product had been around then, we certainly would have organized around it and the ability to set up freely on thousands of portables, rather than attempting to organize around four licenses suitable only for expensive high-end NT workstations (sunk-cost investment - I already owned these licenses and machines) restricted to the continental U.S. It is likely (not guaranteed) that, had ArtOfIllusion been available, I would currently own part of a successful virtual corporation producing animation products and be in a position to purchase expertise and even subcontracted services from the larger ArtOfIllusion community.
I make this statement because ArtOfIllusion, although "open source", installed last night like a shrink-wrap program and comes with a readable, downloadable ten-megabyte user manual that is helping me get started evaluating it. There are extremely useful tutorials on the home site covering solids modeling and animation, not infotechgeekspeak suitable only for confusing desperate users attempting to get their production tool working so that ..... they can produce something useful. Indeed. The Java-based tool does not care whether it is on a Windows machine, a Linux machine, or a high-end workstation, as long as it has a properly installed Java environment. The home site walks the user through installing the necessary Java environment.
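The "distributed rendering with scripts" idea mentioned above is simpler than it sounds: carve the animation's frame range into chunks and hand each chunk to a different machine. Below is a minimal sketch of the frame-splitting arithmetic. Note that `aoi-render` is a placeholder command name, not a real ArtOfIllusion CLI — the node names and scene file are likewise hypothetical; the script only prints the commands it would dispatch.

```shell
# Hypothetical sketch: split frames 1..300 across three render nodes.
# "aoi-render", "scene.aoi", and the node names are all placeholders.
FIRST=1
LAST=300
NODES="node1 node2 node3"

COUNT=$(echo $NODES | wc -w)            # number of render nodes
TOTAL=$((LAST - FIRST + 1))             # total frames to render
CHUNK=$(( (TOTAL + COUNT - 1) / COUNT ))  # frames per node, rounded up

START=$FIRST
for N in $NODES; do
  END=$((START + CHUNK - 1))
  [ $END -gt $LAST ] && END=$LAST
  # Dry run: print the command rather than actually ssh-ing out.
  echo "ssh $N aoi-render scene.aoi --frames $START-$END"
  START=$((END + 1))
done
```

With 300 frames and three nodes this prints one 100-frame job per node. A real version would scp the scene file out first and collect the rendered frames afterwards, but the chunking above is the whole trick.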

To summarize: my plan is to gravitate towards effective developers who are interested in delivering effective tools, whether that means coming up with cash to purchase those tools from established, proven companies that are stable and likely to be around indefinitely, or downloading effective tools of sufficient usefulness and robustness that they are likely to attract an adequate user/developer/vendor community and thrive into the future. My free advice is this: in neither the shrink-wrap case nor the open source case does this tend towards crap produced by grumpy infotechgeeks spouting arcane, obscure, obsolete unix paradigms. If you cannot speak usefully and productively to a non-computer domain specialist (a "computer illiterate idiot user" who has spent twenty years learning biology, farming, orbital dynamics, etc.), then you had better get awfully damn good at computer science or find something else somebody will pay for; since you cannot interact with non-computer specialists, you will need to greatly impress the computer-savvy crowd, and they can be hard to impress sometimes.

End rant. Yes, it could have been organized and written better. Still, a decent editor could do something with it: fix a few problems, make a few suggestions, and let me revise it. A wiki makes this possible. It also can often degenerate into a messy flame war. It might be too bad if most open communities on the web used some kind of Advogato trust metric to lock down their wiki access. OTOH, it might be quite refreshing to get sorted into communities of mostly compatible people. Perhaps one could frequent productive communities for profit and visit flamefests occasionally for the adrenalin rush. Just another stupid idea from a non-unix, non-infotech-indoctrinated computer enthusiast. Viva la ideological illiteracy .... I am ending my rant soon .... really.

I think it is clear I will not be trusting my desktop to any scripts written by anti-user ideologues who think I am an illiterate idiot, if I can avoid it. As I understand it, the open source many-eyeballs approach is supposed to help me avoid the buggy crap produced by these kinds of anti-user technology wonder geeks. Of course, if the anti-user infotechgeek's wonder products are well tested, bug free, available for free download, and come with a readable manual or at least a semi-intelligible man page, I will probably use them despite our ideological differences. A happy, productive user/consumer is in all things an extreme pragmatist; so I, and many economists, believe anyway. I will actually resort to extending a compliment on the off chance of stroking an ego appropriately, thus encouraging further development and use of stuff simple enough for me to use effectively. Much better ending!
