Older blog entries for MichaelCrawford (starting at number 118)

The Fish Gets it Right

I just thought to try out the Fish while emailing a friend to explain what Free Software is all about. She just got her very first computer, running the very non-Free Windows XP.

If you ask the Fish to translate "free software" from English to Spanish, it correctly gives "software libre". But I thought to see if I could trip it up, so I tried translating "software gratis" from Spanish to English. It said "software free". Translating "software libre" from Spanish to English gave "free software".

I'm too old to have any appreciation for Open Source. I view its founding principles largely as a way to enable business to take advantage of naive, yet hardworking programmers. I was a Free Software advocate while most of you were still crawling around in diapers. Well, actually not that old, but I first learned GNU Emacs - and read its source code - I think in 1988.

For me, Free Software is all about the creation of a wonderful community for the people who write it and who use it. I don't think IBM invested a billion dollars in Linux because they cared a whit about community values. One doesn't invest that kind of money without making a coldhearted calculating business decision.

So in my email to Yvonne, I went on at some length to explain what Free Software was all about, but I made no mention of Open Source.

One thing I do have to credit the Open Source movement for is that it has been much more effective at publicity. I doubt that Free Software could ever have been explained in terms that would have excited Wall Street the way the Red Hat and VA Linux IPOs did.

However, I feel the real value of the Open Source movement is that it got a lot of people to learn more about the reasons the GNU General Public License says what it does than might otherwise have been the case. So despite the fact that RMS complains endlessly about Eric Raymond and Linus Torvalds stealing his thunder, I think the Free Software movement has many more real supporters now than it would have if the Open Source movement hadn't happened.

iRATE

I sent this email to Yvonne to explain to her that I wanted to install a particular piece of Free Software called iRATE on her computer, so that she could enjoy easy-to-use downloads of music files, without violating anyone's copyright. From iRATE's page:

iRATE radio is a collaborative filtering client/server mp3 player/downloader. The iRATE server has a large database of music. You rate the tracks and it uses your ratings and other peoples to guess what you'll like. The tracks are downloaded from websites which allow free downloads of their music.

That "large database" has records of 46,000 music files - all of them legal to download.

iRATE is a Java program - it is known to work well on Linux and Windows, and there is a report it works on Mac OS X. I'm going to give it more thorough testing on OS X myself when I get some free time.

It is still a very new program, and although it works well there are some glitches. But I can see that iRATE has the promise of being the killer application that puts the RIAA out of business.

Why would I say that? Because there is so much high-quality music available for free, that's legal to download, that I think most people on the planet could get all the music they ever wanted to listen to off the Net - and they wouldn't be breaking the law.

That would not only devastate the major record labels' sales, but the RIAA would have no grounds to complain about it - the music is all legal to download. The people who would profit would be the indie musicians, who would get more gigs and sell more CDs directly to fans, without the labels getting their greedy cut.

Let me make this clear: the way to get rid of the RIAA's threats of lawsuits, and the government's threats of prosecuting music downloaders, is to get everyone downloading legal music instead. But the RIAA doesn't want you to know that, because they would make even less money if people downloaded the legal music: the RIAA makes no money from the indie musicians. I'm sure the RIAA is well aware that some downloaders do end up buying CDs after downloading a song or two, so despite their bitching they do still make their money.

That's much the same reason that Microsoft views Free Software as a much greater threat than software piracy. Most software pirates are as locked into Windows as legitimate users, and through lawsuits and legislation, Microsoft knows they can eventually force the pirates to pay for software - software that will be published by Microsoft. That's not the case with Free Software.

The music has been out there for years but it has always been difficult to find it and much of what is available is just not very good. There is the same problem with writing on the web - the record labels do provide a useful function, just as editors do in journalism, by selecting the content that's worth paying attention to.

The solution for web publishing is of course moderation. The articles published at Kuro5hin are mostly well-written. But if you're not a K5 member you'll have no idea of the colossal drivel that's often submitted to the moderation queue there. K5's moderation is done by its members, who take the trouble to moderate well because they care about the site. Similarly, the articles at Advogato are mostly well-done because you have to be certified to publish them here. They may not be as well-written as Kuro5hin's, but they are almost always well-informed technically, which is appropriate to the purpose of Advogato.

iRATE solves both these problems - finding the content and selecting the good stuff. You only need to access iRATE's server to locate the content anywhere on the web, and iRATE's collaborative filtering system takes care of picking out the kind of music that you're going to like.
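The collaborative filtering idea is simple enough to sketch. To be clear, this is not iRATE's actual algorithm - just a minimal illustration of user-based filtering, with made-up data: predict my rating for an unheard track from the ratings of users whose tastes resemble mine.

```python
# Minimal sketch of user-based collaborative filtering (NOT iRATE's
# real algorithm). Ratings map user -> {track: score in [0, 1]}.
# To predict my score for an unheard track, weight other users'
# scores by how similar their tastes are to mine.

def similarity(a, b):
    """Crude taste similarity: average agreement on commonly rated tracks."""
    common = set(a) & set(b)
    if not common:
        return 0.0
    return sum(1.0 - abs(a[t] - b[t]) for t in common) / len(common)

def predict(ratings, me, track):
    """Similarity-weighted average of other users' ratings for track."""
    num = den = 0.0
    for user, theirs in ratings.items():
        if user == me or track not in theirs:
            continue
        w = similarity(ratings[me], theirs)
        num += w * theirs[track]
        den += w
    return num / den if den else None

ratings = {
    "alice": {"song_a": 1.0, "song_b": 0.9, "song_c": 0.8},
    "bob":   {"song_a": 1.0, "song_b": 0.8},   # tastes much like alice's
    "carol": {"song_a": 0.1, "song_c": 0.0},   # opposite tastes
}
print(predict(ratings, "bob", "song_c"))  # weighted mostly by alice's 0.8
```

Bob's prediction for song_c leans heavily on Alice's rating, because their tastes agree, while Carol's low rating barely counts. With only a handful of ratings the weights are noisy, which matches my experience that iRATE needs a dozen or so rated downloads before it gets good.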

I found that iRATE started doing a good job of finding music for me after I had downloaded about a dozen songs. Bonita is much more selective about her music, so she had no success after about a half dozen. I told her that iRATE will probably work for her too, but will take longer to learn her tastes. She's willing to give it a try, so she's left it downloading overnight while she's been asleep.

Do You Download Music? Are You an Indie Musician? Concerned About the RIAA's Lawsuits?

Then you should read my article If Indie Musicians Wanted Their Music Heard..., now running in the Op-Ed section at Kuro5hin.

It's about legal music downloading. Many musicians offer legal downloads, so their music will be more widely heard. Try downloading my piano compositions, for example.

If everyone would download the legal music that tens of thousands of artists freely offer, the RIAA would soon dry up and blow away, because they'd have no one to sue anymore, and they'd go broke because no one would be buying Britney CDs anymore.

However, as I explain in the article, the indie musicians themselves create obstacles to their own success by making many of their websites unusable. I give some constructive suggestions.

Many of the responses to the article give links to better music download sites than the ones I initially tried - Bonita is having great success downloading music now. So in the end, it turns out that the well-informed music fan can find lots of easily downloadable music. But that means that the artists with bad websites lose out even more.

16 Jul 2003 (updated 17 Jul 2003 at 00:33 UTC) »
New K5 Article

I have a new article in the moderation queue at kuro5hin: If Indie Musicians Wanted Their Music Heard.... It discusses how problems with website usability interfere with the attempts of unsigned musicians to promote themselves by giving away free music downloads. I offer a number of constructive suggestions.

While the article is still in moderation, the above link will only work if you're a logged in kuro5hin member. If it's approved by the moderators, then it will work for everyone.

If you think my article might be worthwhile, please go read it and vote.

Fighting Age Discrimination and Buzzword Bingo in the Software Industry

Have you been working as a programmer for more than ten years? Have you noticed that the most experience companies want when hiring a Senior Software Engineer is seven years, with three being more common? I have fifteen years paid experience as a programmer. Have you ever seen a job ad seeking a programmer with fifteen years experience? I did - once.

Have you ever lost out in the competition for a position to some kid who Learned Java in 21 Days? I have.

Today a potential client wrote to me to ask if I had the experience in the specific skills needed to develop his product. And the fact is, I don't. But I wrote a reply that convinced him to consider me anyway. While I am often passed up for jobs because I don't have some buzzword on my resume, I am sometimes able to convince clients (and employers, before I started consulting) that it doesn't matter that I don't have the skills they're looking for.

I have an interview scheduled for tomorrow afternoon.

You can read my reply in Do I Have the Skills to Write Your Software?.

I won't post the text of my reply here, because it is a blatant plug for my consulting business. But the page also has the following, which I'd like everyone to read and think about:

Who Should You Hire?

I feel that these days, far too great an emphasis is placed on hiring candidates that know the specific APIs or tools used in the work. This comes at the cost of not hiring candidates who are actually competent programmers.

This leads to buggy and slow code, delays in product development, trouble when unexpected problems arise, and an inability to adapt when the skills needed to develop your software inevitably change over time.

It also leads to the absurd situation in which experienced, skilled and productive developers often cannot find work, because inexperienced and unskilled programmers are hired because they are fortunate enough to be able to list a particular buzzword on their resume. They get those buzzwords because they focussed on whatever was hot when they first learned to program. Despite the fact that experienced programmers can learn new APIs quickly, they are given no credit for the years of work they have spent writing other sorts of software.

Whether or not you choose to hire me to develop your software, I respectfully request that you base your decision to hire any programmer, not just me, on your impression of their competence to develop any sort of software. You will be far better off for doing so, and your company will prosper as a result. If everyone hired programmers this way, we would all be better off because our industry would be healthier. Even the inexperienced programmers would benefit, as they would learn to place greater emphasis on deeper comprehension, and ultimately become flexible enough to do anything themselves.

In job interviews, try asking candidates how they'd approach specific programming problems that are quite outside their area of expertise. Many programmers will be completely snowed by such questions - but some won't be. They may not give the correct answer, but you will be able to tell by the way they explore the problem that they have a clue. These programmers are the ones you want to have working for you.

Knowing how to play a few specific tunes doesn't make you a musician. Knowing your chops does.

Thank you for your attention.

My message from this page, that hiring needs to be based on basic competence and understanding of the fundamentals, is I feel a message that needs to be heard by everyone who hires programmers. It's not just that programmers are suffering by not being able to get jobs for which they really are qualified - it's that businesses are suffering because they hire incompetent programmers as a result of inappropriate priorities.

If you agree with me, write your own version of "Who Should You Hire?" in your own words and post it on your own website. If your company is small, email your statement to your whole company. If it's big, email it to the HR department and all the managers you know.

The focus on buzzwords in resumes has become a serious problem for the industry. Many companies have large databases of resumes, and select candidates for further examination based on simple text searches for keywords like "Visual C++", "SQL", "Linux" or "J2EE". Highly qualified people who do not have the exact keyword they're grepping for never have the good fortune to have their resumes read by a live human being. And this I feel is a damn tragedy.

Do I Have the Skills to Write Your Software? is now linked from a couple of places in my resume.

One of the primary reasons I changed fields from graphical user interface application development to embedded systems programming is that embedded is one of the few areas of software where experience still seems to be truly valued. I still take a great deal of comfort from the fact that when I interviewed at embedded hardware manufacturer Sky Computers, they told me that the youngest programmer on the team had twenty years experience.

Debian

I apologize if my rant yesterday made it sound like I was accusing the many people who work hard to create Debian of doing a bad job. In many ways, which are also obvious to anyone who uses it, I feel Debian is the best distribution. But I think there is room for improvement. I think the problem of being unable to get up-to-date software built with stable's dependencies is a serious one for many users, and is one reason why many users might reasonably pick a different distribution.

The installer is often cited as Debian's weakest point - it can install on any sort of box, no matter how funky, from any media, but it is quite difficult to learn to operate. I don't feel the installer is Debian's weakest point, though. It can be figured out with practice and patience. You can ask an expert to install for you. A screwdriver shop or manufacturer can pre-install. Alternative installers, which may be less flexible but are easier to use for common systems, can be written and are in fact under development.

bcully, yes, pinning is often a solution, and I have used it, but it tends not to work so well for the larger and more complex packages. I should have at least mentioned it though. The bigger a package is, the more dependencies it is likely to have, and the dependencies will have dependencies and so on. Having lots of dependencies will mean more and more software from testing or unstable will be installed, making it more likely to screw up the machine.

What I'd really like is Mozilla 1.4 whose dependencies all come from Woody. You can get that if you build the Mozilla 1.4 source on Woody. People have already made such builds available for another platform, and unless someone does it before I get around to it, I will eventually build it for PowerPC. Actually, having to use Mozilla 1.0 is my #1 complaint about Woody - I can live with just about all the other software on my system without upgrading. So I guess it will be worth my while to put out at least that much effort.

Pinning works best for smaller packages, or at least packages that are self-contained, so they don't have many dependencies.

Pinning is a mostly acceptable solution to the problem, but not the best one: using pinning exposes you to the same problem as just running testing or unstable outright. Eventually a developer will submit a package that breaks your application, you'll do an "apt-get update", and your box will break. It's likely to be a smaller problem with pinning than with running the whole distribution, but it still has the result that at any time you could lose functionality.
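For anyone who hasn't tried it, pinning is configured in /etc/apt/preferences. A minimal sketch of the sort of setup I mean - the package name is just an example, and you'd also need a testing line in sources.list:

```
# /etc/apt/preferences - run stable, but allow one package from testing.
# Priorities below 500 mean "never upgrade to this release on its own";
# the per-package entry overrides that for just the one package.

Package: *
Pin: release a=testing
Pin-Priority: 100

Package: mozilla
Pin: release a=testing
Pin-Priority: 900
```

This keeps apt from dragging the rest of testing onto the box, but as I said above, the pinned package's own dependency chain can still pull in a surprising amount of testing.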

After all, if they weren't in constant flux, testing and unstable would be stable. There needs to be a way for production users to get at least some updates without "participating" in the development process. I think you can keep the vast majority of users happy by providing backports for just a few of the most commonly used packages.

Allow Me to Repeat the Same Complaint Everyone Makes About Debian

I've been using Debian PowerPC Woody on my Macintosh 8500 for a while now. I use it for my masquerading gateway, and usually do my web browsing and email there. For the most part it works OK. I like how easy it is to administer a Debian system. I like being able to install new software (with dependency management) and keep up-to-date with security patches with apt. I especially like the wonderful community of users on the Debian mailing lists.

But there's one thing I can't stand about Debian. One very important thing. It just bit me in the ass:

I cannot get bug fixes to my programs, even if the bugs have already been fixed by the upstream developers.

Yes, Debian is very conscientious about staying up-to-date with security advisories. But the way they do this is by backporting security fixes from upstream into the increasingly old - and otherwise just as buggy - versions of the software running in stable.

I don't run any servers in my little office network that are exposed to the Internet, so as long as my basic firewall is working OK, I'm not too concerned about security. What I'd really like is to get a current version of Mozilla. Bugs in Mozilla affect me every day, sometimes dozens of times a day.

I'm running Mozilla 1.4 on my Windows box. I'm running 1.2 on my OS X Macintoshes, because I haven't got around to installing 1.4 yet (I will soon). They all work great. But I'm running 1.0 on my Debian PowerPC Macintosh. That's the browser I use the most, because it's the one box that is turned on all the time and always runs the same OS. On most of my other boxes I switch OSes all the time, so I don't have my bookmarks or record of sent mail at hand.

I would love to install 1.4 on my Debian Mac, but I can't, because it's not available for PowerPC stable.

Yes, I could run unstable on my Mac, and I was doing that for a few months before Woody became stable, but now that I have a stable system I want it to be stable. The Debian developers feel pretty free to make incompatible changes to unstable, because it's there for developer and testing use, and those changes need to be made so that some day, several years down the road, unstable can become stable - with what, for a few happy weeks, will be all up-to-date software.

When I first got the stable woody installed, I thought it was the best thing since sliced bread.

What happened to me today that made me gripe about Debian? I wrote an email addressed to a whole bunch of people. I had the email all written. I had all the addresses entered. It was something important I wanted to tell a bunch of people, but some of these people don't like each other so I decided I should BCC each of them so they couldn't start flaming each other.

So one by one I changed the To: popups to BCC:. And Mozilla suddenly quit, losing the letter that I'd typed. I've experienced this before. What I've learned to do is, if I'm going to BCC a lot of people, to write my letter on Mac OS X or Windows, where I have a newer browser. But I forgot.

There are some bugs I experience several times a day on 1.0, which don't happen on either 1.2 or 1.4 on my other platforms:

  • After typing into an email for a little while, the window becomes unresponsive. A workaround is that I can save it as a draft, open the draft and continue. This happens for almost every email I type on my Debian Mac.
  • A browser window becomes unresponsive, and stops doing screen updates. If I'm scrolling while this happens, it continues to scroll, and I can't stop it. I have to close the window. Curiously, if this happens in a tab, I can close the tab and the other ones still work OK.
  • Double-clicking the text in the URL entry box doesn't select all the text in the box. It does at first, but then the text to the left of the cursor gets selected while the text to the right is not. This makes it hard to either delete or copy the URL that's in the box.
  • Opening the mail and newsgroups window makes the current browser window go to Find.com, then after the mail window first appears, the Find.com page is brought forward. I have to then click to get my mail window frontmost again. The Find.com people must think the Debian folks are a great bunch of friends to be sending them all this traffic.

There are lots of other little bugs, but those are the ones that bother me the most - dozens of times a day.

Some people have tried to solve this problem. Somewhere I found a page of backports of the most current versions of many popular programs to Woody - but only for x86. I need them for PowerPC.

Another option that any developer can do would be to package their software for more distributions. Sourceforge and other facilities provide machines for building so you can make packages for distributions and architectures you don't run. Just building for all the architectures and package formats isn't the problem - testing the software is. I'm sure the Mozilla project wouldn't find it too difficult to add Debian PowerPC to their build, but there is also YellowDog and LinuxPPC. Debian supports eleven architectures, and some of the other distros have multiple architectures too. Even large projects like Mozilla have to draw the line somewhere on what architectures they'll provide binaries for.

(Developers could make things a lot easier for many users by at least providing build targets in their source code that will build all the major package types. There are lots of packages available for Debian, so I only rarely have to compile from source, but there aren't nearly so many available for Slackware, which I also use, and I hardly ever find either Slack packages or tarballs whose builds will make one.)

Now, I know what you're saying: "Use the Source, Luke". And that's probably what I will do, when I can finally find the time to do it. With almost any other package except Mozilla and GnuCash, I would just get the source out of unstable, and compile it on my Woody system, and then I'd get current software with Woody dependencies.

The problem is that my life has been very hectic for a long time, and doesn't show much prospect of getting any easier anytime soon. Building Mozilla myself is a daunting task. I actually did it once on my Slackware laptop, when I was thinking of helping to develop Mozilla. It requires a gigabyte of disk space to hold all the object files - I had to run all over my filesystem while the build was going on, frantically deleting stuff so the build could complete. I did get a working Mozilla out of it, but it's not something I would care to do again.

And I can't do it on my Mac, because it's an old machine from 1996, and only has 2 GB on its Linux hard drive. My /home and /var partitions have only about 20 MB of free disk space apiece. /usr only has about 2 MB - for quite some time now, if I wanted to install new software, I had to take something else off.

What I did when I wanted to compile a new kernel was mount an NFS filesystem from my Slackware x86 box, where I do have lots of space. And when the time comes, that's what I'll do, and if I get a good build I'll post the package on my website so others can benefit. I'm just not looking forward to it, and I don't know when I can find the time.

I can understand why Debian's policy got to be the way it is. Once a stable system has gone through all its integration and testing, you don't want to change too much or you risk opening up all kinds of problems. The thing is, there are bugs in software that don't impact security but are just as important to fix as the security holes that Debian is so attentive to.

What I'd like to suggest is that Debian users get to vote on packages that they'd like to see updated, maybe provided in a separate backport archive, not part of the stable distribution. By keeping just a couple dozen software packages up-to-date and built with stable's dependencies so you don't have to break your box by installing unstable, the vast majority of Debian users could be kept much happier.

I've been running Slackware for many years, I think since about 3.0. I like Slackware a lot, but Debian has the advantage of the better packaging system. One thing Debian has over Slackware is that the user community is friendly and helpful, unlike all those cranky old bastards on the Slackware newsgroup. But at least with Slackware I can get regular updates to my software. That's one of the reasons I still run it on both my x86 boxes.

23 Jun 2003 (updated 23 Jun 2003 at 06:28 UTC) »
MacHack

MacHack was great. The keynote speech by Ken Arnold was one of the best conference presentations I've heard ever. He had some remarkably insightful things to say about software design.

At last I no longer feel guilty that I don't use the Unified Modeling Language to design my software. I actually learned it once, and tried to design a large program with it, but gave up, feeling that it just wasn't for me. But between then and Arnold's keynote, I always felt I was doing something wrong. He spoke about software design the way people really do it, and I think the way people really should do it, rather than the way Ivory Tower academics say you should, but nobody really does. He also didn't have anything particularly nice to say about UML.

He actually gave two presentations during the one session, the other was an introduction to Jini, which I'd heard of, but was unfamiliar with. I now think it's the greatest thing since sliced bread. For quite some time I haven't been very impressed with Java, and haven't really wanted to do any, but since hearing about Jini, I think I'll pick it up again.

What Jini does is solve the problem of embedding protocol implementations at the edges of the network - in end-user computers. A lot of really complex software is required to implement a wire protocol - the bitstream that goes over the net. What do you do if you want to change the protocol? If you have Linux or BSD, you're lucky, you can install a kernel patch. If you have Windows, you can wait for Microsoft to issue a service pack - or major new version. But what if you have an embedded device whose firmware can't be updated? Throw it away and get a new one! Even if you could update the protocol in end-user machines, there are just too many to make it practical to do so.

What Jini does is implement protocols in Java bytecodes that are kept on a server. The client is written to an API, but the implementation of the API is not present until you try to connect. Then you download it from the server. If you want to talk IPv4 one day and IPv6 the next, you just get some new bytecodes - it's totally invisible to the client.
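Jini itself is Java, of course, but the idea is small enough to sketch in a few lines of Python. This is not Jini's real API - just an illustration of the principle: the client is written against a stable interface, and the concrete protocol implementation arrives from the server as code, so the client never has to be updated when the protocol changes.

```python
# Conceptual sketch of Jini-style protocol mobility (NOT Jini's real API).
# The client is coded against a stable interface; the wire-protocol
# implementation is downloaded as code at connect time.

from abc import ABC, abstractmethod

class Transport(ABC):
    """The stable API the client is written against."""
    @abstractmethod
    def send(self, payload: str) -> str: ...

# Pretend this source text was just downloaded from the lookup server.
# Tomorrow the server could hand out an IPv6Transport replacement and
# the client below wouldn't change at all.
DOWNLOADED_IMPL = '''
class IPv6Transport(Transport):
    def send(self, payload):
        return "ipv6:" + payload
'''

def fetch_transport() -> Transport:
    # Execute the downloaded code in a namespace that already knows
    # Transport, then instantiate whichever subclass it defined.
    ns = {"Transport": Transport}
    exec(DOWNLOADED_IMPL, ns)
    impl_cls = next(v for v in ns.values()
                    if isinstance(v, type)
                    and issubclass(v, Transport)
                    and v is not Transport)
    return impl_cls()

t = fetch_transport()
print(t.send("hello"))  # the client never knew which protocol it got
```

In real Jini the downloaded thing is a serialized proxy object whose class files come over the network, and there's a lookup service to find it, but the essential trick - late-bound, server-supplied implementations behind a fixed interface - is the same.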

My talk on OS X device drivers mostly went well and had a good turnout. It's a complex topic though, and I both didn't get to cover everything I wanted to, and ran overtime, so I got booted out of the room when the next speaker showed up. Maybe I should have focused on a specific aspect of writing device drivers - I tried to cover the basics from beginning to end, and that's too much to do in an hour. I'm going to start writing an article about it soon. I still feel that I'd like to address the problem that some parts of the process of writing an IOKit driver are poorly documented. There is some great documentation and sample code, but it is incomplete.

Heh. I forgot to plug in my iBook when I set up for my presentation. It went to sleep halfway through my talk, so my slides disappeared from the projector screen and I had to scramble to plug it in. My battery is giving out I think. I've had it a year now.

The best part of it was seeing some old friends. There were a number of people I used to work with at Apple, and some people I knew from other companies back in the days. I haven't gone to many conferences for a long time, I really should go to more. I've been really isolated from the developer community, living out in the Maine woods for so long.

I have lots of friends and colleagues online, but now I finally believe my wife when she tells me that knowing people online isn't the same as meeting them face-to-face.

I'm really tired though. MacHack is really exhausting - no one ever gets enough sleep, and I seemed to have a particularly hard time. As soon as I got home I went to bed, and only got up at 11 pm.

One sad note. It turns out I just attended the last MacHack, at least the last conference called by that name. A conference is still planned for next year. The name has apparently fallen victim to post-9/11 hysteria. Apparently it is difficult for some people to get their employers to pay to send them to a hacker's conference, where they participate in a Best Hack contest.

Next year they plan to call it the Advanced Developer's Hands On Conference, or ADHOC, and I think they plan to broaden the topics beyond Mac OS (that's actually been happening for some time). I guess they have to, but it makes me sad - MacHack's been going on for 18 years. There was a 17 year old girl there, I think the daughter of one of the organizers, who has attended every year of her life, starting as an infant.

I wasn't going to enter the hack contest, but in the end I showed off the sample code I wrote to illustrate my talk so everyone could see it. FWDemo works around the exclusive hardware access mechanism of OS X to do some SCSI I/O via SBP2 to a mounted FireWire hard drive. It's not really a hack, but taking advantage of documented features in a straightforward way. Unfortunately I didn't get it together to submit it to the conference CD, so after I rest up some more I will make it available from this page.

I think I probably had the least exciting hack there. I'd be astounded if it got any votes. But the winner (and the one I'd have voted for, if I'd been awake to vote) was a real thing of beauty. The theme of the conference was "Unstoppable", in response to Apple scheduling the WorldWide Developers Conference for this week.

"Unstoppable Progress Bar" prevented those lovely Aqua progress bars from ever completing. Instead, when progress was about to complete, a spout of water burst out of the end of the progress bar and filled up the progress dialog window with undulating waves. Now that's what I call a hack, a really beautiful hack, not what those 31337 k1dz do with their portscanner scripts.

17 Jun 2003 (updated 17 Jun 2003 at 07:50 UTC) »
chaoticset, I'm sure there is an algebraic solution to your problem. The computation required should be very modest, of the order of a multiplication and addition for each of your point coordinates.

What you are looking for is similar to a least-squares fit, but where the function is a vertical line. When fitting points to a function, what you do is define an error function, (conceptually) graph the error function as you vary all the function's parameters, and then locate the minimum.

One property of function minima is that the curve is horizontal, because the error dips down, turns around and goes back up. So the derivative of the error function is zero.

You might have a maximum there though. One way to distinguish is to look at the second derivative, which will be either positive or (for a flat line) zero.

So your error function might be the difference in areas between the points on either side and the line. I think you could use the square of that actually to be sure you're finding a minimum. Differentiate the error function, set it equal to 0, and solve for the parameter you're looking for, in this case the X intersection.
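To make that concrete, here's a small worked example in Python, with my own toy data. For the squared-distance error E(a) = sum((x_i - a)^2), setting dE/da = -2 * sum(x_i - a) to zero gives a = the mean of the x coordinates - exactly the modest computation I claimed, one addition per point and a division:

```python
# Fitting a vertical line x = a to a set of points by least squares.
# Error function: E(a) = sum((x_i - a)^2). Setting the derivative
# dE/da = -2 * sum(x_i - a) to zero gives a = mean of the x coordinates.
# The second derivative is 2 * n, which is positive, so it's a minimum.

def fit_vertical_line(points):
    """Return the x-intercept a minimizing the sum of squared distances."""
    xs = [x for x, _y in points]
    return sum(xs) / len(xs)

def error(points, a):
    """The squared-distance error E(a) we are minimizing."""
    return sum((x - a) ** 2 for x, _y in points)

points = [(1.0, 5.0), (2.0, 3.0), (4.0, 7.0), (5.0, 1.0)]
a = fit_vertical_line(points)
print(a)  # 3.0, the mean of 1, 2, 4, 5

# Sanity check: nudging a in either direction only increases the error,
# confirming we found the minimum of E rather than a maximum.
assert error(points, a) <= error(points, a + 0.1)
assert error(points, a) <= error(points, a - 0.1)
```

Note the y coordinates never enter into it - a vertical line's squared distance to a point depends only on x, which is why the whole fit collapses to an average.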

It happens that the one and only time I've ever used my education in part of my work was to derive the function for fitting a plane to a set of points in 3-space. It's just like doing a line fit, but a little more complicated because of the extra coordinate. I did it for an aerial photography image processing job I had, where we had to correct for the different angles of sun shade that crops would have across the wide field of view of our camera.

Mac OS X

Well I found the solution to my OS X kernel extension bug, that was causing a panic.

I had suspected that it was a bug in OS X itself, rather than my code, because my code was so very simple, and so little of it had run at the point the panic would happen.

I learned all about two-machine kernel debugging with GDB, and was happy that Apple has open-sourced most of the Darwin kernel. After a lot of head scratching, I determined that the problem lay in an incorrect configuration file that accompanies my driver.

Native Mac OS X software comes in "bundles", small directory hierarchies. For a driver called "MyDriver.kext", the executable will be in a file called MyDriver.kext/Contents/MacOS/MyDriver.

There is an XML file in there called Info.plist that declares a lot of important information about your driver: what C++ class to use, what libraries or other kernel extensions it depends on, its version, and (if something else can depend on it) the version of its binary interface. If you update the driver without changing its ABI, the two version numbers will differ.
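From memory, a kernel extension's Info.plist looks roughly like the sketch below. The bundle identifier, class name, and version numbers here are made up for illustration; the key names are the standard ones, but don't treat this as a complete or authoritative example.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple Computer//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Identity and version of this bundle (names are hypothetical) -->
    <key>CFBundleIdentifier</key>
    <string>com.example.driver.MyDriver</string>
    <key>CFBundleExecutable</key>
    <string>MyDriver</string>
    <key>CFBundleVersion</key>
    <string>1.0.1</string>
    <!-- The ABI version others may link against; it lags CFBundleVersion
         when an update doesn't change the binary interface -->
    <key>OSBundleCompatibleVersion</key>
    <string>1.0.0</string>
    <!-- Kernel extensions this driver depends on (versions illustrative) -->
    <key>OSBundleLibraries</key>
    <dict>
        <key>com.apple.kernel.iokit</key>
        <string>1.1</string>
    </dict>
    <!-- Which C++ class to instantiate, and for which provider -->
    <key>IOKitPersonalities</key>
    <dict>
        <key>MyDriver</key>
        <dict>
            <key>IOClass</key>
            <string>com_example_MyDriver</string>
            <key>IOProviderClass</key>
            <string>IOFireWireUnit</string>
        </dict>
    </dict>
</dict>
</plist>
```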

The problem I had was that the contents of an Info.plist file for a kernel extension aren't terribly well documented, so I was basically guessing at it by looking at some others for examples, and I got it wrong.

That's not surprising; what was surprising is the effect it had. The usual result of an incorrect Info.plist file is that your driver just won't load, or something that depends on it won't load. But I had a kernel panic that I could reproduce just by plugging in a firewire drive a couple of times.

It turned out I was confusing OS X's kernel so badly that at some point it was instantiating a different C++ class than the one it actually intended. The kernel can instantiate classes by name, but my Info.plist screwed that up.

Trying to use the bogus object would then call a method that wasn't what the calling code thought it was. That particular method was a reserved method, set aside for future expansion to work around C++'s fragile base class problem. The implementation of all the reserved methods is to call panic().
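A toy illustration of that failure mode (my own sketch, nothing like Apple's actual code): if a runtime instantiates classes by looking their names up in a table, a wrong name silently produces the wrong object, and a reserved placeholder method on that object can only "panic".

```python
# Instantiate-by-name, the way a kernel's class registry works:
# a wrong name in the configuration yields the wrong class, and
# calling its reserved (future-expansion) slot is always fatal.

class RealDriver:
    def start(self):
        return "driver started"

class PaddedBaseClass:
    # Reserved slot kept for future expansion; until it is given a
    # real meaning, calling it is by definition a bug.
    def reserved0(self):
        raise RuntimeError("panic: reserved method called")

registry = {
    "RealDriver": RealDriver,
    "PaddedBaseClass": PaddedBaseClass,
}

def instantiate(name):
    """Look a class up by name and construct it."""
    return registry[name]()

print(instantiate("RealDriver").start())      # driver started
# With the wrong class name, the object looks plausible until a
# reserved method gets called:
# instantiate("PaddedBaseClass").reserved0()  -> RuntimeError
```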

It was a real pain to debug, but I learned a lot. I also understand better some of the things I had been planning to talk about in my MacHack talk later this week. My presentation is going to be a lot better because of it (although I've been spending time debugging that I had planned to spend writing the presentation!)

A widespread problem with programming Mac OS X is that the developer documentation has a lot of holes in it. The documentation that has been written so far is of excellent quality; Apple writes some of the best documentation in the industry. But it is incomplete. I hear complaints about that all the time, not just regarding driver documentation. I guess it's because OS X is still so new.

But I think that if Apple increased its investment in writing documentation and sample code, it would be rewarded disproportionately in Macintosh sales, as people bought Macs to run all the great new software that clueful developers would write. I'd even say it would be a better use of Apple's limited resources than some of its own development: Mac OS X as it stands today works great, and in fact it does far more than developers yet know how to take advantage of.

What sells Macintoshes is not features in the OS, it's applications that run on it that users want to use.

Mike's Great Idea

It just occurred to me that a supremely useful thing to build into any microprocessor would be a circular buffer of the last few program counter addresses executed by the processor.

That way when you jump into space and crash, you'll know where you jumped from.

This sort of thing is traditionally done with an in-circuit emulator, but ICEs have always been expensive and difficult to make, and aren't always even available for today's desktop and server processors.

With all the zillions of transistors in a Pentium 4 or PowerPC 970, would it be so hard to include storage for, say, the last 64 program counter values?
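In software terms the idea is just a small circular buffer, which is part of why it seems cheap to build in hardware: you only ever keep the most recent N entries. Here's a sketch (names and sizes are my own illustration; real branch-trace hardware would look different):

```python
# A fixed-size circular buffer holding the last N program-counter
# values. When full, each new value overwrites the oldest one.

class PCTraceBuffer:
    def __init__(self, size=64):
        self.size = size
        self.slots = [None] * size
        self.count = 0          # total values ever recorded

    def record(self, pc):
        """Store a PC value, overwriting the oldest when full."""
        self.slots[self.count % self.size] = pc
        self.count += 1

    def history(self):
        """Return the recorded PCs, oldest first."""
        if self.count < self.size:
            return self.slots[:self.count]
        start = self.count % self.size
        return self.slots[start:] + self.slots[:start]

trace = PCTraceBuffer(size=4)
for pc in [0x1000, 0x1004, 0x1008, 0x2000, 0x2004]:
    trace.record(pc)
print([hex(pc) for pc in trace.history()])
# ['0x1004', '0x1008', '0x2000', '0x2004'] -- 0x1000 was overwritten
```

After a crash, dumping the buffer oldest-first would show the path the processor took into the bad jump.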

If this turns out to be some sort of novel invention, I hereby release it to the public domain. I'd just like to have this for the bug I'm working on now.

Mac OS X

I'm developing a very simple kernel extension (device driver) for Mac OS X.

And it seems my kernel extension causes a 100% repeatable kernel panic on my iBook, while it works correctly 100% of the time on my blue and white G3. I discovered this late in the game because the driver starts out working OK; it breaks only under further testing that I tried after I had most of it working.

On the iBook it breaks under OS X 10.1.4 and 10.1.5. On the G3 it works under 10.1.3 and 10.1.5. At first I thought the different system versions were to blame, so I finally installed the 10.1.5 update on both machines.

I suspect I've discovered a kernel bug in some code that differs between the two machines. I'm programming firewire: the G3 has a PCILynx firewire link chip, while the iBook has OHCI. Maybe the OHCI driver is broken.

Of course it could be a bug in my code, but my driver is actually so simple at this point that I don't see how it could be.

So far I've done my debugging just by using IOLog to print diagnostic messages, but now I'm about to set up two-machine debugging with GDB over ethernet. Actually, the reason I started to download 10.1.5 is that Apple's version of GDB requires both Macs to be running the same OS X version.

Unfortunately, Apple's instructions for two-machine debugging have been updated for 10.2 and won't work as described on 10.1.5, so I have some figuring out to do. It can be done; I just don't have the right instructions.

If you're wondering why I'm running such an old system when Jaguar is so much better, it's because I want my products to support users who haven't upgraded yet. You have to build drivers on the oldest system you want them to run on: a driver built on 10.2 would be binary-incompatible with a 10.1 system, while a driver built on 10.1 is supposed to run on 10.2.

So I've been hearing all this great stuff about Jaguar for eons, and Panther (10.3) is about to be announced, while I've been running 10.1 the whole time I've owned both these Macs.

