Older blog entries for oubiwann (starting at number 297)

11 Apr 2013 (updated 13 Apr 2013 at 05:04 UTC) »

The Lambda Calculus: A Quick Primer

The λ-Calculus Series
  1. A Brief History
  2. A Quick Primer for λ-Calculus
  3. Reduction Explained
  4. Church Numerals
  5. Arithmetic
  6. Logic
  7. Pairs and Lists
  8. Combinators
To the untrained eye, the notation used in λ-calculus can be a bit confusing. And by "untrained", I mean your average programmer. This is a travesty: reading the notation of λ-calculus should be as easy as recognizing that the following phrase demonstrates variable assignment:
x = 123
So how do we arrive at a state of familiarity and clarity from a starting state of confusion? Let's dive in with some examples and take it a step at a time :-) Once we've got our heads wrapped around Alonzo Church's notation, we'll be able to easily read it -- and thus convert it into code! (We will have lots of practice in the coming posts to do just that.)

A Quick Primer for λ-Calculus

Here's one of the simplest definitions in λ-calculus that you're going to see: the identity function:
λx.x
This reads as "Here is a function that takes x as an argument and returns x." Let's do some more:
λxy.x
"Here is a function that takes x and y as arguments and returns only x."
λx.λy.xy
"An outer function takes x as an argument and an inner function takes y as an argument, returning the x and the y." Note that this is exactly equivalent to the following (by convention):
λxy.xy
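If it helps to see that convention in a language you may already know, here's a rough Python sketch (the names are mine, purely for illustration), using the earlier λxy.x example: a two-argument function and its curried, one-argument-at-a-time form are just two spellings of the same thing.

identity = lambda x: x                 # λx.x
first_flat = lambda x, y: x            # λxy.x -- one two-argument function
first_curried = lambda x: lambda y: x  # λx.λy.x -- an outer function returning an inner one

print(first_flat(1, 2))     # 1
print(first_curried(1)(2))  # 1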
Let's up the ante with a function application:
λf.λx.f x
"Here is a function that takes a function f as its argument; the inner function takes x as its argument; return the result of the function f when given the argument x." For example, if we pass a function f which returns its input multiplied by 2, and we supplied a value for x as 6, then we would see an output of 12.

Let's take that a little further:
λf.λx.f (f (f x))
"Here is a function that takes a function f as its argument; the inner function takes x as its argument. Apply the function f to the argument x; take that result and apply f to it. Then do it a third time, returning that result." If we had the same function as the example above and passed the same value, our result this time would be 48 (i.e., 6 * 2 * 2 * 2).

That's most of what you need to read λ-calculus expressions. Next we'll take a peek into the murky waters of λ-calculus reduction and find that it's quite drinkable, that we were just being fooled by the shadows.


Syndicated 2013-04-11 04:39:00 (Updated 2013-04-11 15:55:46) from Duncan McGreggor

9 Apr 2013 (updated 18 Jul 2014 at 06:16 UTC) »

Interview with Erlang Co-Creators

A few weeks back -- the week of the PyCon sprints, in fact -- was the San Francisco Erlang conference. This was a small conference (I haven't been to one so small since PyCon was at GW in the early 2000s), and absolutely charming as a result. There were some really nifty talks and a lot of fantastic hallway and ballroom conversations... not to mention Robert Virding's very sweet Raspberry Pi Erlang-powered wall-sensing Lego robot.

This was my first Erlang Factory; the event lasted for two fun-filled days and culminated with a stroll in the evening sun of San Francisco down to the Rackspace office where we held a Meetup mini-conference (beer, food, and three more talks). Conversations lasted until well after 10pm with the remaining die-hards making a trek through the nighttime streets of SOMA and the Financial District back to their respective abodes.

Before the close of the conference, however, we managed to sneak a ride (4 of us in a Mustang) to Scoble's studio and conduct an interview with Joe Armstrong and Robert Virding. We covered some of the basics in order to provide a gentle overview for folks who may not have been exposed to Erlang yet and are curious about what it has to offer our growing multi-core world. This went up on the Rackspace blog as well as the Building 43 site (also on YouTube). We've got a couple of teams using Erlang at Rackspace; if you're interested, be sure to email Steve Pestorich and ask him what's available!


Syndicated 2013-04-09 21:24:00 (Updated 2014-07-18 05:49:45) from Duncan McGreggor

The Lambda Calculus: A Brief History

Over this past weekend I took a lovely journey into the heart of the lambda calculus, and it was quite amazing. My explorations were made within the context of LFE. Needless to say, this was a romp of pure delight. In fact, it was so much fun, and clarified so many nooks and crannies of something I had simply not explored very thoroughly in the past, that I had to share :-)

The work done over the past few days is on its way to becoming part of the documentation for LFE. However, this is also an excellent opportunity to share some clarity with a wider audience. As such, I will be writing a series of blog posts on λ-calculus from a very hands-on (almost practical!) perspective. There will be some overlap with the LFE documentation, but the medium is different and as such, the delivery will vary (sometimes considerably).

This series of posts will cover the following topics:
  1. A Brief History
  2. A Primer for λ-Calculus
  3. Reduction Explained
  4. Church Numerals
  5. Arithmetic
  6. Logic
  7. Pairs and Lists
  8. Combinators
The point of these posts is not to expound upon that which has already been written about endlessly. Rather, the hope is to give a very clear demonstration of what the lambda calculus really is, and to do so with clear examples and concise prose. When the gentle reader is able to see the lambda calculus in action, with lines of code that clearly show what is occurring, the mystery will disappear and an intuition for the subject matter will quite naturally begin to arise. This post is the first in the series; I hope you enjoy them as much as I did rediscovering λ-calculus :-)

Let us start at the beginning...

A Brief History

The roots of functional programming languages such as Lisp, ML, Erlang, and Haskell can be traced to the concept of recursion in general and λ-calculus in particular. In previous posts, I touched upon how we ended up with the lambda as a symbol for the anonymous function as well as how recursion came to be a going concern in modern mathematics and then computer science.

In both of those posts we saw Alonzo Church play a major role, but we didn't really spend time on what is quite probably his greatest contribution to computer science, if not mathematics itself: λ-calculus. Keep in mind that the Peano axioms made use of recursion, that Giuseppe Peano played a key role in Bertrand Russell's development of the Principia, that Alonzo Church sought to make improvements on the Principia, and that λ-calculus eventually arose from these efforts.

Church invented λ-calculus in 1928 but didn't publish it until 1932. When an inconsistency was discovered, he revised it in 1933 and republished. In this second paper, Church also introduced a means of representing positive integers using lambda notation, now known as Church numerals. Church and Turing both published papers on computability in 1936 (based respectively upon λ-calculus and the concept of Turing machines), each proposing a solution to the Entscheidungsproblem. Though Gödel preferred Turing's approach, Rosser suggested in 1939 that the two definitions were equivalent. A few years later, Kleene proposed the Church Thesis (1943) and then formally demonstrated the equivalence between his teacher's and Turing's approaches, giving the combination the name of the Church-Turing Thesis (1952, in his Introduction to Metamathematics). Within eight years, John McCarthy published his now-famous paper describing the work that he had started in 1958: "Recursive Functions of Symbolic Expressions and Their Computation by Machine". In this paper, McCarthy outlined his new programming language Lisp, citing Church's 77-page book The Calculi of Lambda-Conversion (1941), sending the world off in a whole new direction.

Since that time, there has been ongoing research into λ-calculus. Indisputably, λ-calculus has had a tremendous impact on research into computability as well as the practical applications of programming languages. As programmers and software engineers, we feel its impact -- directly and indirectly -- on a regular, almost daily basis.


Syndicated 2013-04-09 05:22:00 (Updated 2013-04-09 05:22:48) from Duncan McGreggor

3 Apr 2013 (updated 4 Apr 2013 at 03:03 UTC) »

Autoscale and Orchestration: the Heat of OpenStack

Several months before I joined Rackspace last year, there were efforts under way to provide an Autoscaling solution for Rackspace customers. Features that we needed in OpenStack and Heat hadn't been released yet, and there were no OpenStack experts on the Autoscaling team. As such, the engineers began developing a product that met Rackspace customer needs, integrated with the existing monitoring and load-balancing infrastructure, and made calls to OpenStack Nova APIs as part of the scaling up and down process.

At PyCon this year, Monty Taylor, Robert Collins, Clint Byrum, Devananda van der Veen, and I caught up and chatted about their views on the current status of autoscaling support in OpenStack Heat. It seems that the two pieces we need the most -- LBaaS and support for external monitoring systems (perhaps via webhooks) -- are nascent and not ready for prime time yet. Regardless, Monty and his team encouraged us to dive into Heat, contribute patches, and in general, release our work for consumption by other Stackers.

Deeply encouraged by these interactions, we took this information to Rackspace management and, to quote Monty Python, there was much rejoicing. Obviously OpenStack is huge for Rackspace. Even more, there is a lot of excitement about Heat, the existing autoscaling features in OpenStack, and getting our engineers involved and contributing to these efforts.

In the course of these conversations, we discovered that Heat was getting lots of attention internally. It turns out that another internal Rackspace project had been doing something pretty cool: they were experimenting with the development of a portable syntax for application description and deployment orchestration. Their work had started to converge on some of the functionality provided by Heat, and their experience had been similar to the Autoscaling team's. The timing was right to contribute what they had learned and align their continued efforts with adding value to Heat.

Along these lines, we are building two new teams that will focus on Heat development: one contributing to features related to autoscaling (not necessarily limited to Heat) and the other contributing to the ongoing conversations regarding the separation of concerns between orchestration and configuration management. Everyone -- from engineers to management -- is very excited about this new direction in which our teams are moving. Not only will it bring new developers to OpenStack, but it will also align our teams with Rackspace's OpenStack roots and the company's vision for supporting the growing cloud community.

Simply put: we're pretty damned pumped and looking forward to more good times with OpenStack :-)


Syndicated 2013-04-03 17:03:00 (Updated 2013-04-04 02:19:53) from Duncan McGreggor

3 Apr 2013 (updated 21 Nov 2014 at 21:14 UTC) »

Maths and Programming: Whence Recursion?

As a manager in the software engineering industry, one of the things that I see on a regular basis is a general lack of knowledge among less experienced developers (not always "younger"!) with regard to the foundations of computing and the several related fields of mathematics. There is often a great deal of focus on what the hottest new thing is, or how the industry can be changed, or how we can innovate on the decades of profound research that has been done. All noble goals.

Notably, another trend I've recognized is that in a large group of devs, there are often a committed few who really know their field and its history. That is always so amazing to me and I have a great deal of admiration for the commitment and passion they have for their art. Let's have more of that :-)

As for myself, these days I have far fewer hours a week to dedicate to programming than I had 10 years ago. This is not surprising, given my career path. However, what it has meant is that I have to be much more focused when I do get those precious few hours a night (and sometimes just a few per week!). I've managed this in an ad hoc manner by taking quick notes about fields of study that pique my curiosity. Over time, these get filtered and a few pop to the top that I really want to give more time.

One of the driving forces of this filtering process is my never-ending curiosity: "Why is it that way?" "How did this come to be?" "What is the history behind that convention?" I tend to keep these musings to myself, exploring them at my leisure, finding some answers, and then moving on to the next question (usually this takes several weeks!).

However, given the observations of the recent years, I thought it might be constructive to ponder aloud, as it were. To explore in a more public forum, to set an example that the vulnerability of curiosity and "not knowing" is quite okay, that even those of us with lots of time in the industry are constantly learning, constantly asking.

My latest curiosity has been around recursion: who first came up with it? How did it make its way from abstract maths to programming languages? How did it enter the consciousness of so many software engineers (especially those who are at ease in functional programming)? It turns out that an answer to this is actually quite closely related to a previous post I wrote on the secret history of lambda. A short version goes something like this:

Giuseppe Peano wanted to establish a firm foundation for logic and maths in general. As part of this, he ended up creating consistent axioms around the hard-to-define natural numbers, counting, and arithmetic operations (which utilized recursion). While visiting a conference in Europe, Bertrand Russell was deeply impressed by the dialectic talent of Peano and his unfailing clarity; he queried Peano as to his secret for success (Peano told him) and then asked for all of his published works. Russell proceeded to study these quite deeply. With this in his background, he eventually co-wrote the Principia Mathematica. Later, Alonzo Church (along with his grad students) sought to improve upon this, and in the process ended up developing the lambda calculus. His student, John McCarthy, later created the first functional programming language, Lisp, utilizing concepts from the lambda calculus (recursion and function composition).

In the course of reading some 40-50 mathematics papers (including various histories) over the last week, I have learned far more than I had originally intended. So much so, in fact, that I'm currently working on a very fun recursion tutorial that not only covers the usual practical stuff, but also steps the reader through programming implementations of the Peano axioms, arithmetic definitions, the Ackermann function, and parts of the lambda calculus.
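To give a small taste of what that tutorial will cover, here is a minimal Python sketch of my own (illustrative only, not the tutorial itself): Peano-style addition defined by recursion on the successor, alongside the Ackermann function.

def add(m, n):
    # Peano-style addition: add(m, 0) = m; add(m, n + 1) = add(m, n) + 1
    if n == 0:
        return m
    return add(m, n - 1) + 1

def ackermann(m, n):
    # Total and computable, but famously not primitive recursive
    if m == 0:
        return n + 1
    if n == 0:
        return ackermann(m - 1, 1)
    return ackermann(m - 1, ackermann(m, n - 1))

print(add(2, 3))        # 5
print(ackermann(2, 3))  # 9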

I've got a few more blog post ideas cooking that dive into functions, their history and evolution. We'll see how those pan out. Even more exciting, though, was having found interesting papers discussing the evolution of functions and the birth of category theory from algebraic topology. This, needless to say, spawned a whole new trail of research, papers, and books... and I've got some great ideas for future blog posts/tutorials around this topic as well. (I've encountered category theory before, but watching it appear unsearched and unbidden in the midst of the other reading was quite delightful).

In closing, I enjoy reading not only the original papers (and correspondence between great thinkers of a previous era), but also the meanderings and rediscoveries of my peers. I've run across blog posts like this in the past, and they were quite enchanting. I hope that we continue to foster that in our industry, and that we see more examples of it in the future.

Keep on questing ;-)


Syndicated 2013-04-03 00:33:00 (Updated 2014-11-21 21:10:05) from Duncan McGreggor

13 Mar 2013 (updated 13 Mar 2013 at 15:03 UTC) »

Erlang Meetup at Rackspace, San Francisco

I'm very pleased to announce that Rackspace's San Francisco office will be hosting an Erlang Meetup immediately after the closing session of the last day of SF Erlang Factory on 22 March.

We are honored to announce that both Robert Virding and Francesco Cesarini will be present for the event. The night will open at 6:30pm with Mediterranean food and refreshments (and I hear there will be a keg). Around 7:15pm we will gather for a presentation by the very talented folks at Boundary, which will start at 7:30pm.

The Boundary presentation will be followed by one from Robert Virding. We'll finish the night with a door prize give-away of two copies of Francesco and Simon's book Erlang Programming: A Concurrent Approach to Software Development. Perhaps the winners might be able to talk Francesco into signing their copies? ;-)

Last but not least, some of Rackspace's own Erlangers will be on hand for conversations about how we're using Erlang/OTP to deliver Fanatical Support in our Data Center and Cloud Ops groups.

We're going to start with a cap of 60 people for the event, but if there's high demand, we have room to adjust this. Due to some communications hiccups, we're going to do the planning for this event the old-fashioned way: send an email to duncan.mcgreggor@rackspace.com, and let me know that you'd like to attend. Your name will be added to the list in the order I receive emails. The list of attendees and the waiting list are published here and will be refreshed with each new addition.

Update! The event is now published on meetup.com -- Thanks for all your help, Andra!

Special thanks go out to Robert Virding, Phil Toland, and Brian Troutwine -- conversations with them were the inspiration for putting this event together. Thanks guys! (Note that I've put that background info into its own blog post, so as not to confuse the event announcement ... too much ;-) ).

See you at Erlang Factory SF and then on Friday at the Rackspace office!


Syndicated 2013-03-13 13:07:00 (Updated 2013-03-13 14:38:57) from Duncan McGreggor

Lisp Flavored Erlang


I've flirted with Lisp since the 90s, really started getting into it around 2008 when I started playing with genetic programming, and more recently investigated Common Lisp, Scheme (Gambit and Chicken), and Clojure with the intent of discovering the best way to write distributed programs in Lisp. (I even seriously explored implementing chunks of Twisted in Common Lisp. If only I had more time...)

Needless to say, I kept coming back to Erlang as it is a natural choice for the sort of concurrent programming I was interested in doing. On several occasions, I'd run across Robert Virding's Lisp-2 dialect, which he had written on top of the Erlang VM. At first blush, this appeared quite promising. Yet faced with the perceived burden of learning Erlang, I consistently set it aside for a future date.

"Excuse me, could I have some Erlang, please? Yes, just a cup. Oh, and can I get that Lisp-flavored? Thanks so much."

After bouncing between Clojure and CL, after running into difficulties with Termite on Chicken Scheme, and finally, after being quite impressed with the efforts made by the Elixir folks (who I believe took inspiration from LFE!), I decided to give LFE a chance. Within minutes of that decision, I came to two conclusions:
  1. LFE is brilliant.
  2. LFE needs docs like Elixir... and tutorials... and exposure! Why haven't I been using LFE all along?!
At which point, I started hacking on some docs to see if I could stick with it. When, after a few days, I proved to myself that I could, I contacted Robert and let him know not only how much I adored his masterpiece, but that I really wanted to write tons and tons of docs for it so that anyone could pick it up and start using it right away. I wanted to write docs for an audience like me: people who didn't know Erlang and weren't Lisp gurus.

This seemed like a doable goal, since I had about 5 minutes' worth of Erlang experience at the time I was having these conversations with Robert. I was learning Erlang at a rapid pace simply by virtue of the Lisp hook that LFE provided.

Our interactions led to the publicizing of the new docs site for Lisp Flavored Erlang on the LFE Google Groups list. We also created a Twitter account (we both have full access to it, but I tend to maintain it) whose sole purpose is to bring LFE to more people, keep the news around LFE fresh, etc.

"I could have sworn you just said 'Lisp'..."

A side note about Lisp: S-expressions are concise and elegant. Code and data using the same form is quite powerful. I do believe that the technology industry has the capacity to realize that old biases against Lisp are just that: old and outdated. The many dialects of Lisp are anything but. Clojure and (I believe) LFE are perfect examples of this. Whole new generations of programmers are delighting in the expressive power of a language whose roots can be traced back to actual manipulations of memory registers.

To resume the narrative: in the course of various efforts focused on documenting LFE, asking questions on the mail list, and having various other discussions, Robert pointed out that some of my coworkers at Rackspace had been working on Erlang projects. I subsequently reached out to Phil Toland. Then, within minutes of this (and entirely coincidentally), Kai Janson emailed a group of us about Erlang Factory SF and his desire to provide Erlang workshops for engineers at Rackspace.

This led to further conversations with Robert, then with Francesco, with several Rackers signing up for Erlang Factory this year, and finally, with me volunteering to put a Meetup together afterwards, hosted at Rackspace's SF office (more on that in a few hours).

For the curious, I do continue to work in Python and Twisted; I am excited about the new async support that Guido is spearheading for Python 3 and which has electrified so many hard-core Python hackers. Similarly, I continue to hack on Common Lisp projects. However, I am quite delighted that I have found a way to interface with Erlang which matches how I think, matches my aesthetics. And finally, I look forward to many fruitful years of LFE in my life :-)

Thanks Joe! Thanks Mike! Thanks Robert!


Syndicated 2013-03-13 06:17:00 (Updated 2013-03-13 06:17:01) from Duncan McGreggor

Unbeatable Culture at Rackspace

Culture is hugely important to me in any company I want to be a part of. Having telecommuted for many years, working in an office environment is a big step. I need to be lured, enticed, and teased. Something needs to overcome the massive convenience of remote work. It's a big lifestyle change.

A company's culture has been one of the crucial factors in evaluating such a big shift in my daily work. The culture is what makes it worthwhile -- and fun! -- to endure a commute, work in a shared environment, deal with myriad distractions, etc.

Having spent time visiting and hanging out in many of the top-tier start-up (and established) company offices in the Bay Area, I was very impressed with what Rackspace had to offer: staff, designers, sales, and engineers who were highly gifted, loved their jobs, were fanatically dedicated to their company, and didn't take themselves too seriously. After a long and romantic courtship, I finally took the plunge. Knowing that life is full of disappointment, I was fully aware that this was a risk.

As such, I was pleasantly surprised to find that after a month at Rackspace, the cultural experience was a genuine one. However, I was blown away when I attended Rookie O at the beginning of January and experienced the depth of what Rackspace has done for the past 15 years, what they offer on the whole as a company, and how they keep everything real and fun. It was such an eye-opener for me that I had to blog about it. Here's a teaser for you:
"This is what I and so many of my peers have been searching for in a technical company: a place with heart, room to grow and a vast frontier in front of us. The key thing, though, is the heart. That’s the support for everything else, and it’s something that often ends up riding in the back of the bus on a trip to nowhere. Rackspace has the heart driving the bus. This is not a conscious technical criteria for so many of us when seeking employment, but it is the reason that we grow so dissatisfied at other companies and why Rackspace has been such a surprise for me."
Incidentally, while drafting that post, two other interesting things happened:
  1. Fortune magazine named Rackspace one of the best 100 companies to work for in 2012, and
  2. Bloomberg released an article about Rookie O (the same class I was in, oddly; also, note that it's targeted at the "cool," cynical audience ;-) )
In 2011 we made Fortune's Top 100 list as well, coming in at #74. This year, we jumped 40 places on that list to #34 :-) How cool is that?!

Rackspace is always hiring, and our San Francisco office is growing as well. We do everything from Twisted and Python to iOS, Node.js, Lua and Java. We're in the cloud, on the desktops, and jumping into mobile devices. If any of this excites you, check out the openings!


Syndicated 2013-01-24 16:47:00 from Duncan McGreggor

28 Dec 2012 (updated 21 Nov 2014 at 21:14 UTC) »

The Secret History of Lambda

Being a bit of an origins nut (I always want to know how something came to be or why it is a certain way), one of the things that always bothered me with regard to Lisp was that no one seemed to be talking about the origin of lambda in the lambda calculus. I suppose if I wasn't lazy, I'd have gone to a library and spent some time looking it up. But since I was lazy, I used Wikipedia. Sadly, I never got what I wanted: no history of lambda. [1] Well, certainly some information about the history of the lambda calculus, but not the actual character or term in that context.

Why lambda? Why not gamma or delta? Or Siddham ṇḍha?

To my great relief, this question was finally answered when I was reading one of the best Lisp books I've ever read: Peter Norvig's Paradigms of Artificial Intelligence Programming: Case Studies in Common Lisp. I'll save my discussion of that book for later; right now I'm going to focus on the paragraph at location 821 of my Kindle edition of the book. [2]

The story goes something like this:
  • Between 1910 and 1913, Alfred Whitehead and Bertrand Russell published three volumes of their Principia Mathematica, a work whose purpose was to derive all of mathematics from basic principles in logic. In these tomes, they cover two types of functions: the familiar descriptive functions (defined using relations), and then propositional functions. [3]
  • Within the context of propositional functions, the authors make a typographical distinction between free variables and bound variables (that is, functions that have an actual name): bound variables are written with circumflex notation, e.g. x̂(x+x). [4]
  • Around 1928, Church (and then later, with his grad students Stephen Kleene and J. B. Rosser) started attempting to improve upon Russell and Whitehead regarding a foundation for logic. [5]
  • Reportedly, Church stated that the use of x̂ in the Principia was for class abstractions, and he needed to distinguish that from function abstractions, so he used ∧x [6] or ^x [7] for the latter.
  • However, these proved to be awkward for different reasons, and an uppercase lambda was used: Λx. [8].
  • More awkwardness followed, as this was too easily confused with other symbols (perhaps uppercase delta? logical and?). Therefore, he substituted the lowercase λ. [9]
  • John McCarthy was a student of Alonzo Church and, as such, had inherited Church's notation for functions. When McCarthy invented Lisp in the late 1950s, he used the lambda notation for creating functions, though unlike Church, he spelled it out. [10] 
It seems that our beloved lambda [11], then, is an accident in typography more than anything else.

Somehow, this endears lambda to me even more ;-)



[1] As you can see from the rest of the footnotes, I've done some research since then and have found other references to this history of the lambda notation.

[2] Norvig, Peter (1991-10-15). Paradigms of Artificial Intelligence Programming: Case Studies in Common Lisp (Kindle Locations 821-829). Elsevier Science - A. Kindle Edition. The paragraph in question is quoted here:
The name lambda comes from the mathematician Alonzo Church's notation for functions (Church 1941). Lisp usually prefers expressive names over terse Greek letters, but lambda is an exception. A better name would be make-function. Lambda derives from the notation in Russell and Whitehead's Principia Mathematica, which used a caret over bound variables: x̂(x + x). Church wanted a one-dimensional string, so he moved the caret in front: ^x(x + x). The caret looked funny with nothing below it, so Church switched to the closest thing, an uppercase lambda, Λx(x + x). The Λ was easily confused with other symbols, so eventually the lowercase lambda was substituted: λx(x + x). John McCarthy was a student of Church's at Princeton, so when McCarthy invented Lisp in 1958, he adopted the lambda notation. There were no Greek letters on the keypunches of that era, so McCarthy used (lambda (x) (+ x x)), and it has survived to this day.
[3] http://plato.stanford.edu/entries/pm-notation/#4

[4] Norvig, 1991, Location 821.

[5] History of Lambda-calculus and Combinatory Logic, page 7.

[6] Ibid.

[7] Norvig, 1991, Location 821.

[8] Ibid.

[9] Looking at Church's works online, I see that he uses lambda notation in his 1932 paper A Set of Postulates for the Foundation of Logic. The preceding papers upon which the seminal 1932 paper is based, On the Law of Excluded Middle (1928) and Alternatives to Zermelo's Assumption (1927), make no reference to lambda notation. As such, A Set of Postulates for the Foundation of Logic seems to be his first paper that references lambda.

[10] Norvig indicates that this is simply due to the limitations of the keypunches in the 1950s that did not have keys for Greek letters.

[11] Alex Martelli is not a fan of lambda in the context of Python, and though he is a good friend of Peter Norvig, I've heard Alex refer to lambda as an abomination :-) So, perhaps it's not beloved by everyone. In fact, Peter Norvig himself wrote (see above) that a better name would have been make-function.


Syndicated 2012-12-28 17:34:00 (Updated 2014-11-21 21:12:30) from Duncan McGreggor

Seeking a Twisted Maintainer

Last week we posted on the Twisted Matrix blog about the maintainer position for the Twisted project being open. We are looking for a motivated and experienced release manager and core contributor. Our core maintainers are getting busier and busier with specialized Twisted work, and no longer have the time they once dedicated to maintaining Twisted.

The post on the Twisted Matrix blog gives a quick overview of the position; if you're interested, please check out the fellowship proposal for more details and email the address on that page (at the bottom).

Also, feel free to ping glyph, exarkun, or myself (oubiwann) on #twisted-dev on IRC to chat about it more.


Syndicated 2012-12-18 18:02:00 (Updated 2012-12-18 18:02:29) from Duncan McGreggor
