Older blog entries for crhodes (starting at number 155)

As I said in my last entry, I was in Amsterdam for ECLM 2011, once again smoothly organized by Edi Weitz and Arthur Lemmens, but this time under the aegis of the Stichting Common Lisp Foundation (of which more a bit later). After leaving the comfortable café, where Luke and Tobias (along with a backpack's worth of computing equipment on its way to visit St Petersburg) eventually turned up, it was time to go for the Saturday evening dinner, held at Brasserie Harkema. In the olden days, when I had time to do a certain amount of public-facing Lisp development, I got used to receiving the adulation of a grateful public – this time, at the dinner, I happened to sit next to someone called Lars from Netfonds. “Hmm,” said something at the back of my mind, “that rings a bell.” Lars who? Lars Magne Ingebrigtsen. My inner fanboy went a bit squeee – even to the point of explaining what gmane was to a third party in his presence. Still, it was nice to be able to say a heartfelt “thank you” in person to someone whose software has saved me time and a certain amount of embarrassment. Other topics of conversation at the dinner included a discussion with R. Matthew Emerson (of Clozure) about the social aspects of Free Lisp development, a topic on which I have written before; contrasting the attitudes and experiences of contributors and users (small and large) of Clozure CL and SBCL was interesting. It was also nice to be able to talk about Lisp-based music analysis, synthesis and generation programs; reminding myself that I still know enough about that landscape to fill people in.

The meeting itself, as others have observed over the years, is only partly about the talks: a substantial part of the goodness is in the chats over coffee and lunch. Edi and I reminisced about meeting in the venue, Hotel Arena, at a precursor to ECLM (in autumn 2004, I think... I certainly remember being approximately penniless, just after starting my first job); other people present then (as well as Arthur) included Nick Levine, Luke Gorrie, Peter van Eynde, Jim Newton, Pascal Costanza, Marc Battyani, Nicholas Neuss... many of whom were around for the rematch; a total of 95 people registered for the meeting, and the hall (part disco, part church) for the talks felt pleasantly full.

Of the talks, I was most interested in the material of Jack Harper's talk, concerning some of the constraints involved in building a product for (human) fingerprinting, and asserting that using Lisp in this product was not a problem. (Favourite quote: “batteries are complicated things”). I was a little bit disappointed that few of the speakers actually interacted with any code at all (Luke may claim that writing his slides in Squeak Smalltalk counts, but I beg to differ); in fact, Paul Miller of Xanalys was the only one of the speakers spending substantial time demonstrating anything related to the subject of the talk – and that only because the canned demo movie refused to display on the projector. Luke's talk appeared to go down well; the obvious first question came and went, and there were some more interesting questions from the floor. Star of the show was Zach Beane's talk about quicklisp; I spend a lot of time presenting or watching presentations in each of my capacities, and it's nice to have a refreshingly different (and deadpan) delivery, with good use of slides to complement the spoken content. I hope that he's right that his personal scalability will not be taxed, and that volunteers will find ways to assist in the project by taking ownership of particular tasks.

While Hans Hübner may have attempted to be controversial in his opinion slot about style guides for CL, the real controversy for me was Dave Cooper's announcement of the Stichting Common Lisp Foundation. Now, the Foundation has clearly done one thing that is helpful: provided legal and financial infrastructure so that the financial risk of hosting an ECLM is not borne entirely by two individuals; the corporate entity can potentially, after acquiring a buffer, provide the seed funding needed and, if necessary, absorb small ECLM losses (not that I believe there has been one, but hypothetically) through other fund-raising activities. On the other hand, when I asked how the Stichting CL Foundation would aim to distinguish itself from the ALU, the response from Dave Cooper was that the only difference would be that the foundation would focus on CL, whereas the ALU's remit extends to all members of the Lisp family. Such a narrowing of focus is, I think, potentially beneficial – indeed, when going through my email archives to look for the date of the 2004 meeting, I found a lucid rationale from Dan Barlow explaining that he had chosen to make CLiki's focus specifically DFSG-free Unix Lisp software in order to promote a sense of cohesion (rather than being motivated primarily by a strongly-held belief about the inherent superiority of DFSG-licensed software). But I don't think that the ALU's only weakness is that it spreads its Lisp net too wide: I think it has lost track of what it as an entity wants to do, beyond performing a similar function for the ILC as the Stichting has performed for the ECLM; Nick Levine, in his talk about how to find Lisp resources, observed that the ALU has a valuable piece of real estate – the lisp.org domain – which does not seem to be used to grow or meet the needs of the Lisp community, whether Common Lisp specifically or Lisp more generally. I found it a little sad that, Edi and Arthur aside, the overlap between the ALU board and the Stichting CL Foundation directors is 100%.

After the longer talks came the lightning ones, and I took the opportunity to repeat my talk and demo about swankr, my implementation of the SLIME backend for R, from the European Lisp Symposium in April. Erik Huelsmann announced ABCL 1.0, a far better milestone to announce at an ECLM than my sneaky announcement of SBCL 0.9 (six years ago!? Doesn't time fly! Also, what ugly slides...). And after some more lightning (and less-lightning) talks, it was time to wrap up with drinks, dinner, and good conversation.

I'm in Amsterdam for the European Common Lisp Meeting, 2011 vintage. Still wearing my two hats, as “academic” and “entrepreneur” – and, somewhat to my surprise, still enjoying it. Though I do have a fairly nasty cold, possibly a result of too many late nights (business), early mornings (children), and interaction with disease-ridden individuals (students).

I'm sitting in the Café de Jaren, a haunt which I think is popular with students – but today it looks just plain popular. They seem very accommodating, with newspapers to wade through (admittedly, I brought my own), free WiFi, and tasty soup and sandwiches. I've been here before; in fact, getting on for a decade ago, my wife and I mislaid a copy of Asterix and the Somethings (dunno which) in Dutch. It's a pleasure to sit here, waiting for my colleagues to show up so that I can inspect Luke's presentation for blatant falsehoods, or at least off-message content. Looking forward to this evening's brasserie outing and of course the talks tomorrow – it'll be particularly interesting to see how Jack Harper's presentation compares with our Teclo experience – and of course it'll be good to catch up with old friends, some of them in the flesh for the first time...

Hey, what happened to that resolution to blog weekly about being entrepreneurial? Well, it's been a long few months: course mostly delivered; PhD student approximately completed (well done, Ben); plenty of extra time to actually be entrepreneurial. Before I sink back down into the mire of too much to do and not enough time, an update!

I went to the 4th European Lisp Symposium, held at the Technical University of Hamburg-Harburg. It was great. Compared with last year, when I was Programme Chair and volcano eruptions closed most of European airspace, leading to scrambles to find alternative keynote speakers and general stress about whether there were going to be any attendees at all, this was a breeze. Sure, I participated by reviewing a few contributions, but the event itself snuck up on me – I found myself on the Monday remembering that straight after my teaching duties on Wednesday, I needed to dash to the airport to catch a plane. Very pleasant; thanks to Didier Verna and Ralf Möller for making things so smooth that I could just turn up and assume that the event would be running perfectly – I know how much work it takes to get to that point.

It was good to catch up there with some of the wider Lisp world; there were about 60 attendees, including a solid transatlantic contingent. I couldn't quite allow myself to relax completely, and so ended up giving a lightning talk about R – a useful warmup for my slightly more substantial talk (slides; audio recording appears to have failed) at the Zürich Stuff'n'Lisp User Group. The cuteness of adding two lattice objects together (in graphical presentation form) to get a new graph combining the two originals seems never to get old, though since it's in fact six months old I did take the time this morning to commit and push the accumulated fixes to my public swankr git repository.

Right. Back to work work work – I mean, fun hacking.

What I got for Christmas: sufficiently advanced Intel graphics drivers for 855GM, in Linux 2.6.37-rc7. No more missing mouse cursor on boot (and, icing on the Christmas cake, working video playback!) Thank you to those who worked on this, particularly since I couldn't actually work out how or where to submit useful bug reports (and so resorted to my usual strategy when dealing with laptop-related issues, which is to contact mjg59 by whatever means available and follow his suggestions as precisely as possible).

2 Dec 2010 (updated 2 Dec 2010 at 17:05 UTC)

As the train I'm on ambles its unheated way through the unseasonably wintry English countryside, it's time for another “weekly” exciting entrepreneurial update. Actually I should be properly working, not just talking about working, but there's a file I need for that elsewhere, and Three's mobile Internet coverage evaporates about 3 minutes outside Waterloo station – if only there were a company dedicated to bettering mobile data infrastructure... So, here I am, with means, motive and opportunity to write a diary entry.

Since I last wrote, I have fought with R's handling of categorical variables in linear models; the eventual outcome was a score draw. The notion of a contrast is a useful one: very often, when we have a heap of conditions under which we observe some value, what we're interested in is not so much the predicted value given some condition as the difference between the value under one condition and the value under another. The canonical example is probably the difference between a group receiving a trial treatment and a group receiving a control or placebo; accordingly, the default contrast for unordered categorical variables in R is called the treatment contrast (contr.treatment).
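By way of illustration, a minimal sketch with made-up numbers (the variable names are my own):

  ## with the default contr.treatment, the intercept is the baseline
  ## (first) level's mean and each remaining coefficient is that level's
  ## difference from the baseline
  group <- factor(c("control", "control", "treated", "treated"),
                  levels = c("control", "treated"))
  value <- c(1.0, 1.2, 2.1, 1.9)
  coef(lm(value ~ group))
  ## (Intercept)  = 1.1, the control-group mean
  ## grouptreated = 0.9, the treated mean minus the control mean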

In my particular case, I wanted to know the difference between any particular category and the average response – none of the categories I had in my system should have been privileged over any of the others, and there wasn't anything like a “control” group, so comparing against the overall average is a reasonable thing to want to do, and indeed it is supported in R through the use of the sum contrast contr.sum. However, this reveals a slight technical problem: the overall average plus a difference for each category makes one more parameter than the (effective) number of values; just as in simultaneous equations, this is a Bad Thing. (Technically, the system becomes underdetermined.) So, in solving the system, one of the differences is jettisoned; my problem was that I wanted to visualise that information for all the differences, whether or not the last one was technically redundant – particularly since I wanted to offer a guideline as to which differences were most strongly different from the average, and I would be out of luck if the most unusual one happened to be the one jettisoned. Obviously I could trivially compute the last difference, simply from the constraint that all the differences must sum to zero (and actually dummy.coef does that for me); but what about its standard error?
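Again a minimal sketch, with made-up numbers, of the two pieces just mentioned:

  ## with contr.sum, the intercept is the average of the group means and
  ## the coefficients are per-level differences from it; the last level's
  ## difference is jettisoned from coef() but recoverable via dummy.coef()
  f <- factor(rep(c("a", "b", "c"), each = 2))
  y <- c(1, 2, 4, 5, 9, 10)
  m <- lm(y ~ f, contrasts = list(f = contr.sum))
  coef(m)        # intercept 5.17; f1 = -3.67 (level a), f2 = -0.67 (level b)
  dummy.coef(m)  # also reports the redundant difference for level c, 4.33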

Enter se.contrast. This function allows the user to construct an arbitrary contrast, expressed most simply as a vector of contributions to that contrast, and to ask an aov object for the standard error of that contrast. Some experimentation later – for a linear model m of len observations, a particular factor variable f, and a function class.ind to construct a matrix of class indicator values (i.e. for a vector of observations, a matrix x where x[i,j] is 1 if observation i came from condition j, and zero otherwise) – I think that:


  anova <- aov(m)                      # se.contrast wants an aov object
  ci <- class.ind(data[[f]])           # len x k indicator matrix of levels
  ci <- ci[, colSums(ci) != 0]         # drop any level with no observations
  ## each column compares one level's mean with the grand mean: centre the
  ## per-level averaging weights with the len x len matrix I - (1/len)*J
  contrasts <- (diag(len) - (1/len)*matrix(rep(1, len*len), nrow=len)) %*%
    ci %*% diag(1/colSums(ci))
  ses <- se.contrast(anova, contrasts)
gives me a vector ses of the standard errors corresponding to the sum contrasts in my system, including the degenerate one. (As seems to be standard in this kind of endeavour, the effort per net line of code is huge; please do not think that I wrote these five lines of code off the top of my head. Thanks to the denizens of the r-help mailing list, and in particular to Greg Snow for his answer to my question about this.)

So, this looks like total victory! Why have I described this as only a score draw? Well, because while the above recipe works for a single factor variable, in the case I am actually dealing with I have all sorts of interaction terms between factors, and between factors and numerical variables, and again I want to display and examine all the contrasts, not just some subset of them chosen so that the system of equations to solve is nondegenerate. This looked sufficiently challenging, and the analysis to be done looked sufficiently peripheral to the current business focus, that it's been shelved, maybe for a rematch in the new year.

My weekly diary schedule has already slipped! In my defence, last week was exceptional, because the major activity was neither entrepreneurial nor academic, but practical and logistical: moving house. A lengthy rant about the insanity of the English conveyancing system is probably not of great interest to readers of this diary, so I will save the accumulated feelings of helplessness and insecurity for some other outlet.

Meanwhile, back to work. It's the start of teaching next week; fortunately, I am teaching largely the same material as last year, so now is the time that I can reap the benefit of the preparation time I spent on the course over the last two years. Inevitably, there will be new things to include and outdated material to remove or update, but by and large I should be able to deliver the same content.

This is a relief, because of course this year I only have one fifth of my time on academic-related activities. This means that various things have to be sacrificed or delegated, not least some of my extra-curricular activities such as being release manager of SBCL – so I'm very glad that Juho Snellman has volunteered to step in and do that for the next while. (He suffered the by-now traditional baptism of fire, dealing with amusing regressions late in the 1.0.42.x series, and released version 1.0.43 today; we'll see how his coefficient of grumpiness evolves over the next few months).

In the land of industry, what I've mostly been doing is drawing graphs. As the screenshot in my previous entry suggests, I'm using R for data processing and visualisation; I have datasets with large numbers of variables, and the facilities for visualising those quickly and compactly with the lattice package (implementing Becker and Cleveland's trellis paradigm) are very convenient. By and large, progressing from prototype visualisation to presentation- or publication-quality visualisation is also straightforward, but I spent so long figuring out one thing that I needed to do this week that I'll document it here for posterity: that thing was to construct a graph using lattice with an axis break. It's not that it's absurdly difficult – there are plenty of hookable or parameterisable functions in the lattice graph-drawing implementation; the difficult part is finding out which functions to override, which hooks to use, and which traps to avoid.
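For context, the quick-and-compact part really is a one-liner; a minimal sketch with invented data (d, g and so on are my own names): a conditioning variable after | in the formula yields one panel per level.

  ## a minimal invented example of the trellis idiom
  library(lattice)
  d <- data.frame(x = rep(1:10, 3),
                  y = rnorm(30),
                  g = factor(rep(c("a", "b", "c"), each = 10)))
  xyplot(y ~ x | g, data = d, type = "l")  # three panels, one per level of g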

The problem as I have found it is that when drawing a lattice plot, for these purposes, things happen in an inconvenient order. First, the axes, tickmarks and labels are drawn, using the axis function provided to the lattice call (or axis.default by default); then the data are plotted using the panel function. So, that would be fine; one could even hackily draw over the axis in the panel function to implement the axis break, at least if one remembers to turn clipping off with clip=list(panel="off") in par.settings. Except that the axis function doesn't actually draw the axis lines; instead, there's a non-overridable bit of plot.trellis which draws the box around the plot, effectively being the axis lines – and that happens after everything else.

So, piling hack upon hack: there's no way of not drawing the box. There is, however, a way of drawing the box with a line thickness of zero: pass axis.line=list(lwd=0) in par.settings as well. Ah, but then the tick marks have zero thickness too. Oh, but we can override that setting of axis.line$lwd within our custom axis function. (Each of these realisations took a certain amount of time, experimentation, and code reading to come to pass...). What it boils down to, in the end, is a call like


xyplot(gmeans.zoo,
       screens=1, col=c(2,3,4), lwd=2, lty=3, more=TRUE,
       ylim=c(0.5,1.7), scales=list(
                          x=list(at=dates, labels=date.labels),
                          y=list(at=c(0.5,1.0,1.5),
                            labels=c("- 50%", "± 0%", "+ 50%", "+ 100%"))),
       key=list(lines=list(col=c(2,3,4)),
         text=list(lab=c("5m", "500k", "galileo"))),
       xlab="Date",
       par.settings = list(axis.line=list(lwd=0),
         clip=list(panel="off", strip="off")),
       # custom axis function: draw tick marks and labels by hand, then the
       # left and right borders in full and the top and bottom borders with
       # a gap around the break date (the default box is suppressed via the
       # axis.line lwd=0 setting above)
       axis=function(side, scales, components, ...) {
         print(scales)
         lims <- current.panel.limits()
         trellis.par.set(axis.line=list(lwd=0.5))
         panel.axis(side=side, outside=TRUE, at=scales$at,
                    labels=scales$labels,
                    draw.labels=side %in% c("bottom", "left"), rot=0)
         panel.lines(lims$xlim[[1]], lims$ylim, col=1, lwd=1)
         panel.lines(lims$xlim[[2]], lims$ylim, col=1, lwd=1)
         panel.lines(c(lims$xlim[[1]], as.Date("2010-09-11")+0.45),
                     lims$ylim[[1]], col=1, lwd=1)
         panel.lines(c(lims$xlim[[2]], as.Date("2010-09-11")+0.55),
                     lims$ylim[[1]], col=1, lwd=1)
         panel.lines(c(lims$xlim[[1]], as.Date("2010-09-11")+0.45),
                     lims$ylim[[2]], col=1, lwd=1)
         panel.lines(c(lims$xlim[[2]], as.Date("2010-09-11")+0.55),
                     lims$ylim[[2]], col=1, lwd=1)
       },
       # panel function: plot the data, white out a narrow strip at the
       # break date, then draw the diagonal break marks and annotations
       panel=function(x,y,...) {
         xs <- current.panel.limits()$xlim
         ys <- current.panel.limits()$ylim
         panel.xyplot(x,y,...)
         panel.polygon(as.Date("2010-09-11")+c(0.4,0.6,0.6,0.4),
                       c(ys[1]+0.05,ys[1]+0.05,ys[2]-0.05,ys[2]-0.05),
                       border="white", col="white", alpha=1)
         panel.lines(xs,1,col=1,lty=3)
         panel.lines(as.Date("2010-09-11")+c(0.5,0.6),
                     c(ys[1]-0.025,ys[1]+0.025), col="black")
         panel.lines(as.Date("2010-09-11")+c(0.4,0.5),
                     c(ys[2]-0.025,ys[2]+0.025), col="black")
         panel.lines(as.Date("2010-09-11")+c(0.5,0.6),
                     c(ys[2]-0.025,ys[2]+0.025), col="black")
         panel.lines(as.Date("2010-09-11")+c(0.4,0.5),
                     c(ys[1]-0.025,ys[1]+0.025), col=1, lwd=1)
         panel.text(as.Date("2010-09-20"),
                    t(gmeans.zoo[1,])+c(0.01,0,-0.01),
                    sprintf("%2.0f%%", round(100*t(gmeans.zoo[1,]-1))),
                    pos=4)
       })

allows me to draw a picture like

[image: time-series graph with an axis break]

for us to show to interested parties.

New laptop video (intel 855GM) drivers holding up pleasantly well. One crash so far, from attempting to play a video; I haven't tried to reproduce it, since that's not something I do very often in any case. (Said video was of my own extremely minor contribution to UK televisual arts programming, so maybe my video drivers were wisely preventing me from narcissistic overexposure...)

Meanwhile, back in the land of sort-of-industrial research and development, I've been using (and simultaneously learning) R for analysing the meaty chunks of data that my colleagues are generating. A couple of my academic labmates were already R users, unashamedly using R for their data treatment needs, even when their data came from Lisp programs. Shocking, I know. So, when substantial datasets started landing in my lap a few months ago (not just from a group of people in mobile broadband; energy use data and computer usage metrics also crossed my desk), I decided that the time was ripe to learn some new tools and techniques.

Initial reactions: mostly positive. To help put my observations into context: I've dabbled in MATLAB before, and it's never quite stuck. The everything-is-a-matrix aspect was painful; graphical output was OK but nothing to write home about; and the environmental support was pretty painful. (GNU Octave suffers from all of these problems too, and then some; it does have the significant advantage, from my point of view, that at least there isn't the attempted lock-in from purchasing a bare proprietary wrapper over BLAS and LAPACK, with the option to buy yet more functionality wrapping BLAS and LAPACK in slightly different ways.) I've also used (a long, long time ago now) IDL, again a vector-oriented language; my memory of it is mostly faded, but I remember being satisfied with its graphing facilities and much more satisfied with an Emacs mode than with its default user interface. That was in 1998; I dare say things would be much the same today...

By contrast, R has data types that are mostly comfortable to my inner Lisp programmer. Yes, number crunching is best done through vectors (matrices being a thin wrapper around vectors rather than a distinct language data type), but lists of vectors are fine, and are used to collect data into data.frames. It has a lightweight object system, with single dispatch on the class of the first argument; mind you, classes of objects are a pretty mutable concept in R, settable at arbitrary points in program execution (there's mostly no relationship between object class and object contents). Speaking of settable, there's the `<-` operator both for assignment and for mutation, like Common Lisp's setf. “Mutation” there might actually not be quite the right word; the evaluation semantics are mostly lexical binding and call-by-value, with the interpreter attempting to perform copy-on-write and deforestation optimizations. Speaking of interpreters, there's a reified environment and call stack at all stages of program execution, which makes the language mildly tricky to compile, particularly since environments are just about the only thing in the language which can be mutated; this aspect of the language has recently been the subject of discussion in the R community (and, would you believe it, one of the camps is advocating a rewrite of the engine using Common Lisp both as a model and as an implementation language, mentioning SBCL by name... sadly without increasing my citation count. Oh well.)
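A small sketch of the points above, with everything invented for illustration:

  ## a data.frame is essentially a list of equal-length vectors...
  d <- data.frame(x = 1:3, y = c(2.5, 3.5, 4.5))
  is.list(d)                 # TRUE
  ## ...and class is just a settable attribute, with single dispatch
  ## on the class of the first argument
  class(d) <- "mything"
  print.mything <- function(x, ...) cat("an opaque mything\n")
  print(d)                   # dispatches to print.mything
  ## call-by-value with copy-on-write: the caller's vector is unchanged
  f <- function(v) { v[1] <- 0; v }
  v <- 1:3
  f(v); v                    # v is still 1 2 3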

In any case, once I saw the reified environments, and also spotted parse (analogue to read) and eval, I started wondering whether the comint-based Emacs mode for R, Emacs Speaks Statistics, could be enhanced or replaced with some SLIME-like functionality. I was particularly interested in whether there was enough introspective capability to replace the default debugger, which I was not having much success in using at all. So I started digging into the documentation, finding first try, then tryCatch (OK, that's like handler-case, so far so normal), and then, on the same help page, withCallingHandlers and withRestarts, direct analogues of handler-bind and restart-case. At that point it seemed logical, instead of trying to write some SLIME-like functionality for ESS, to simply write an R backend for SLIME. With some careful (ahem) adaptation of a stub backend Helmut Eller wrote for Ruby, I got cracking, and swankr now supports a SLIME REPL, SLIME scratch buffers, SLDB, the default inspector, and presentations.
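To illustrate the analogy (the function and restart names here are my own invention, nothing to do with swankr's internals):

  ## withRestarts / withCallingHandlers mirror Common Lisp's
  ## restart-case / handler-bind
  risky <- function(x) {
    withRestarts({
      if (x < 0) signalCondition(simpleCondition("negative input"))
      sqrt(x)
    }, use.value = function(v) v)
  }
  ## the handler runs without unwinding the stack, then transfers
  ## control to the chosen restart, as invoke-restart would in CL
  withCallingHandlers(risky(-1),
                      condition = function(c) invokeRestart("use.value", NA))
  ## => NA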

Here's a teaser screenshot, demonstrating functionality not yet merged: image output for lattice objects, presented in the SLIME REPL, and usable in subsequent input. There's plenty in swankr that doesn't work, and plenty more that's buggy, but it might be good enough for the enthusiastic R/Lisp/Emacs crossover community to give it a try.

[screenshot: Lattice Presentations]

(No, my screen is not that tall.)

In my new life as an itinerant entrepreneur, spending significant amounts of time both working from strange places and travelling on trains, my productivity depends at least in part on having a good development laptop. In my other current life lecturing in Creative Computing, I also need to be able to display reliably to a data projector. Neither of these was a problem until relatively recently.

For the last four years or so, after obtaining a recommendation from one of the masters of Linux and laptops, I've used an IBM (by now Lenovo) X40: light, fairly rugged (it has survived at least one drop), almost full-sized keyboard, and – importantly – it fits into more or less any carrying equipment. However, the relative instability (which I am not the only one to experience) brought about by Linux and X.org's change to kernel mode setting (KMS) was beginning to get worrying. The X40 has an Intel 855GM graphics card, and since Intel is participating heavily in KMS support and development, new things got turned on early; I follow squeeze (Debian testing), which gives me some of the thrills and spills of being on the leading edge of consumer Linux development; most pertinently for the start of the new academic year, I've been suffering from odd display corruption on the VGA output.

In various combinations of Xorg and kernel versions, trying to work out a set of versions that work for me, I have experienced:

  • intermittent (about once a day) X server crashes, leaving the hardware in a sufficiently inconsistent state that the X server refuses to start again (linux v2.6.33ish, xserver-xorg-video-intel 2.9.1)
  • failure to start the X server on boot at all (linux v2.6.33ish, xserver-xorg-video-intel 2.12.0+legacy1-1)
  • missing mouse cursor on X server start (linux v2.6.35, xserver-xorg-video-intel 2.9.1)
  • substantial (~0.5s) latency about once every 10 seconds, with kslowd or kworkqueue processes taking about 20% of the CPU (linux v2.6.35 and v2.6.36-rc3, xserver-xorg-video-intel 2.9.1)

The good news is that a large number of Intel-graphics-related fixes appeared in Linus' master branch yesterday, and at least some of these problems are fixed; with a module parameter poll=0 for the drm_kms_helper module, the 0.5s latencies are gone (put options drm_kms_helper poll=0 in a file in /etc/modprobe.d). The VGA corruption I was experiencing seems to have been fixed somewhere between Linux versions 2.6.33 and 2.6.35; I have hopes, too, that the X server crashes might be substantially less frequent (but I haven't been running a single kernel for long enough to check yet). The one remaining issue that is definitely still present is the missing mouse cursor when X first starts; a suspend/resume cycle works around it fairly reliably for me. Thank you to all those who've worked on fixing these problems.
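Concretely, that workaround is a one-line configuration file, something like the following (the filename is arbitrary, my own choice):

  # /etc/modprobe.d/i915-workarounds.conf
  # disable output polling in the DRM KMS helper to avoid the latencies
  options drm_kms_helper poll=0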

I'm particularly glad that all of this is just about sufficiently fixed, because the alternative would have been to get a new laptop, and as well as my natural disinclination to purchase new stuff when not strictly necessary (and let's face it, the initial phases of a startup are not usually those where money is abundant), it seems to be largely impossible to get a new one with the form factor of the X40: with the growing prevalence of widescreens, just about every modern laptop is substantially bigger than this one, and thus would not fit in some of my carrying equipment. I will just have to preserve this one as long as possible.

I'm on a train! A train, heading towards Christchurch, with an amusingly bad synthesized (at least, I hope it's synthesized) announcement of train stops.

Being on a train would not normally be worthy of a diary entry. However, I'm travelling not as part of my heretofore stable academic enquiries, but as a mostly-fledged member of the high-flying technological startup community. No, I haven't quite sold out; as of yesterday, I work four days a week for a bright-futured company in the general area of mobile and wireless broadband; the other working day will continue to be spent at Goldsmiths, where I retain most of my responsibilities (but not the management of the music-informatics OMRAS2 project, which finished at the end of August); this arrangement with Goldsmiths will continue for a year.

So, why? A number of factors came together to make this a very attractive opportunity. Firstly, as I've already mentioned, the research project whose technical side I was responsible for came to an end last month; although the project itself was interesting, and its existence was at least partly responsible for me having a permanent academic position, I think it's fair to say that there were all sorts of wetware headaches as a result. The end of the project was therefore a natural point to start afresh — and in particular, to try to focus on aspects of work that I enjoy. Secondly, my PhD students are largely getting towards the end of their studies; most should be submitting their dissertations in the next few months. While PhD students are usually an asset to an academic, it is true that they can take up significant amounts of time, particularly in the early stages; not having to guide any in the early stages of their research next year gives me the freedom to investigate other activities, and to try to broaden my experience. Thirdly, when the person who comes calling is as awesome as our CTO, and when the rest of the gang is full of names I recognize, well, I expect to learn a lot. (I already have!)

And of course, the fact that some of the underlying technology the company uses is software that I'm fairly familiar with is attractive; it would be good to show the world that there are more SBCL-using companies with a healthy business than just the poster child. If I'm lucky, the Free Software and Lisp-related content on this blog should noticeably increase, because although the academic life gives a huge freedom to think, it doesn't actually give a large amount of time to do very much with those thoughts; in this newly-acquired business role, as well as the inevitable all-hands pitching in until the small hours to make the demos work with bits of string and glue, I hope that there will be time to implement and reflect on interesting things.

It's been a while. It's in fact almost embarrassingly late for me to be blogging now about the 2010 European Lisp Symposium, which was over a month ago – in my defence, I point to the inevitable stress-related illness that follows excessive concentration on a single event, coupled with hilarious clashing deadlines at work and in my outside activities.

So, we cast our minds back to April. When I booked my flight to Lisbon, I deliberately chose not to fly with British Airways, on the basis that they were likely to strike. It turned out that the activities of British Airways cabin crew were going to be the least of the problems associated with getting to an international conference...

Yes, shortly after I booked my tickets, Eyjafjallajökull went *boom* and most of Europe's airspace closed for a week, with ongoing disruption for the best part of a month. There were moments during the disruption when I wondered whether there was an actual curse affecting ELS, but in the event things cleared up and there was only minimal disruption to delegates and speakers, both getting there and getting back.

But enough about transport! How was the symposium itself? Well, I enjoyed the programme – but given that I had as much control over it as events would allow me, perhaps that's not the most unbiased endorsement of quality. Still, I was entertained by all the keynote speakers, from the window into the business world opened by Jason Cornez of Ravenpack, via the practical philosophy and view on history afforded by Kent Pitman, to the language experimentation and development in PLT Scheme (now Racket) described enthusiastically by Matthias Felleisen. Pascal Costanza's tutorial on parallel and concurrent programming was highly informative, and there was a good variety of technical talks.

It's often said, though, that the good stuff at conferences happens between the items on the programme, and for that the local organization needs to be on the ball. I'm glad to say that António Leitão and Edgar Gonçalves, and their helpers, enabled a huge amount of interaction: lunch, coffee and tea breaks, and evening meals (including a fabulous conference banquet, but also a more informal dinner and a meal punctuated by Fado). I gather the excursion to Sintra on the Saturday was interesting; by then I was at the airport, looking nervously at the departure boards...

I enjoyed meeting and talking with many of the attendees; some whose names I knew but whose faces I didn't (and some whose names I knew because they shared them with other people I knew already: take a bow, both Luís Oliveiras); but with one eye always on the next thing that could go wrong, I didn't get to go very deeply into interesting conversations. Maybe next year, in Hamburg for ELS 2011 (expect the paper deadline to be around 31st December 2010) – in the meantime, there's likely to be a journal special issue with an open call for papers coming soon, and of course the ALU are holding an International Lisp Conference this year, whose call for papers is currently open. So get writing!
