Rob Pike on the irrelevance of systems research

Posted 2 May 2000 at 17:08 UTC by advogato

While browsing the Software Carpentry discussions, I came across this link to a presentation by Rob Pike, arguing that computer systems research is in decline.

Pike and the other AT&T folk have been critical of Linux and free software for some time. I believe this article raises some good points, though obviously I am nowhere near as pessimistic as he is about free software.

The relation between academic research in computer science and free software is an interesting one. A pure CS education leaves one with the experience of having written lots of 100-line programs, and maybe a few bigger ones. It doesn't teach you to program; free software does. Yet a great deal of free software is written by people in the academic environment -- if it weren't for grad students, the catalogue of free software would be a lot poorer.

Discussion welcome, but one request: please read the presentation first.


Thoughts.. , posted 2 May 2000 at 19:53 UTC by listen » (Journeyer)

This is a silly ramble. Don't expect an eloquent rebuttal or anything ;-)

I agree with him that Microsoft has done a lot of research and innovation. But we all know that anything good MS comes up with will be rolled into free systems eventually (witness Bonobo and Nautilus/Konqueror, which borrow a lot from COM and IE respectively, though all three do have their own innovations).

Also, it is clearly easier to clone something that works, then change it in response to problems. This is obviously the case with a lot of free software. But if the original implementation is done with thought, then a lot of variation can come about easily. A good example is Linux. Al Viro plans to support Plan 9-like namespaces, for example, and a lot of other stuff is clearly better than standard Unix. There are a lot more people willing to implement something they know works than to just follow a whim. I think we will see Linux eclipse most other widespread OSes in terms of the elegance and usefulness of its code, and I'm sure that when we stop "playing catchup" with commercial Unix and NT in certain areas, there will be a lot more new ideas implemented. But that is some way off.
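
For anyone who hasn't met Plan 9's namespaces: the idea is that every process can carry its own private view of the mount table, instead of sharing one global one. Here is a minimal sketch of what that could look like on Linux, assuming a CLONE_NEWNS-style flag to clone() -- the interface Viro actually ends up with may differ, and this would need root to run:

    /* Hypothetical sketch: per-process mount namespaces via clone(). */
    #define _GNU_SOURCE
    #include <sched.h>
    #include <signal.h>
    #include <stdio.h>
    #include <sys/mount.h>
    #include <sys/types.h>
    #include <sys/wait.h>

    static char child_stack[64 * 1024];

    static int child(void *arg)
    {
        /* This tmpfs mount is visible only inside the child's namespace. */
        if (mount("none", "/tmp", "tmpfs", 0, NULL) != 0) {
            perror("mount");
            return 1;
        }
        printf("child: /tmp is a private tmpfs here\n");
        return 0;
    }

    int main(void)
    {
        /* CLONE_NEWNS gives the child its own copy of the mount table,
           so its mounts never show up in the parent -- roughly what
           Plan 9 does for every process by default. */
        pid_t pid = clone(child, child_stack + sizeof(child_stack),
                          CLONE_NEWNS | SIGCHLD, NULL);
        if (pid < 0) {
            perror("clone");
            return 1;
        }
        waitpid(pid, NULL, 0);
        printf("parent: /tmp is untouched here\n");
        return 0;
    }

The point is just that this slots into the clone() call Linux already has, rather than needing a whole new OS.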

But I think the real problem he has is this: when there was a lot of OS research, machines capable of running new operating systems were themselves new -- people were finding out the best ways to do stuff. When something came along that worked pretty well (e.g. Unix), it was adopted by a lot of people, and they came to rely on its behaviour. So if you want to do something radically different, you really need a good reason to get people to use it. Academics didn't care if their systems were used in the past, but now they seem to. Maybe they feel that Linux or the web shouldn't have been so successful. So now they are doing things that don't need so much persuasion to get used.

I think that most of his comments on Linux are driven by some kind of jealousy of its success with respect to Plan 9. Though Linux used a lot of Plan 9 design ideas (clone(), dentries, devfs, etc.), he calls it "not that hot". I dunno.

Anyway, if people do want to research something, why should they have to write a whole new OS? Are there that many situations where this is necessary? Linux should provide a reasonable base for most OS research, and if the result turns out useful, then it can become widespread in a relatively short time.

All IMVeryHO.

Sounds more like agreement than rebuttal, posted 2 May 2000 at 20:35 UTC by dan » (Master)

I agree with him that Microsoft has done a lot of research and innovation. But we all know that anything good MS comes up with will be rolled into free systems eventually

...

Also, it is clearly easier to clone something that works,

...

Though Linux used a lot of Plan 9 design ideas (clone(), dentries, devfs, etc.)

I didn't garner the impression from reading the original article that he was in any way critical of free software; he just said that it didn't tend to involve any significant original research. From your followup it looks like you agree ...

erm.. , posted 2 May 2000 at 20:50 UTC by listen » (Journeyer)

I said it was a ramble, not a rebuttal, what more do you want?

reality check, posted 2 May 2000 at 23:00 UTC by graydon » (Master)

I think there are still plenty of projects doing systems research appropriate to modern concerns of safety, networkability, high-level abstraction, performance and programmability, which are not "just variations on Unix". visible, active examples:

mozart, haskell & cayenne, exokernel, erlang, amorphous, pict, mercury, fiasco, OSKit & fluke, ocaml, ecos, spin, sather, 2k & off, NESL, hurd

his massive oversimplification of the internet and free software deserves mention as well: TCP/IP may have been on a "high end workstation" 10 years ago, but now it's in every home, every classroom, every building, and an increasing number of people's pockets. that's very different, and things have not remained completely static. we have had to think and design our way through security, scalability, availability and management problems which were without precedent in 1990. many of these are ongoing: we are seeing many new ways of using the massive availability of networks as programmable systems, and that's only increasing. likewise, note that while Linux is perhaps the most obvious free software project, the availability of the internet for loose, long-term collaboration has spawned dozens of operating system projects, weird extensions and modifications to Linux, programming languages, experimental GUIs, etc. It may be true that certain American universities are turning into corporate yes-men, or that certain former primary research facilities are giving in to numb-skulled shareholders, but I don't think the world as a whole has given up on systems research.

even his statement that there are no new CPUs is patently false, as is his apparent disinterest in portability. What about PPC, MIPS, SH3, Alpha, StrongARM, Crusoe, etc.? What about routers, PDAs, cell phones, video game systems, embedded controllers? All PCs?

maybe Pike hasn't bothered to look outside Bell Labs in a while.

interesting, posted 2 May 2000 at 23:17 UTC by apgarcia » (Journeyer)

Thank you for the link -- this is good stuff. I confess from the start that I have a very high opinion of Rob Pike, and my reaction to this monolog of his is no exception.

First, this presentation/paper is a call to arms, a goad to create and take risks. I do not think that Rob Pike has any antipathy towards Linux or Free Software. He says that they demonstrate innovation in their development model but not in the software itself, which is mostly true.

Some other insights he shares are also quite accurate:

  - standards conformance stifles creativity, both in software and in hardware.
  - in general, we use fewer operating systems today than ten, twenty, n years ago -- think not just of the commercial stuff like VMS, RSTS, TENEX, MPE [HP], MVS, etc., but also academic systems like ITS, CTSS, the Berkeley timesharing system, the Dartmouth timesharing system, etc. It's true.
  - money going into R&D these days wants results and wants them now; researchers have less freedom.
  - "Stanford now encourages students to go to startups because successful CEOs give money to the campus. The new president of Stanford is a successful computer entrepreneur." -- again, he's on the mark. This presentation was given in February -- I don't remember when Hennessy (yes, of the famous Patterson/Hennessy architecture book) was appointed President of Stanford this year, but he, of course, started MIPS. Know what else came from Stanford? VMware. Yep. The founder left grad school to start the company, and he took a bunch of fellow grad students with him. They were doing very similar research for the FLASH project (http://www-flash.stanford.edu/) at Stanford.

Pike also gives a number of great suggestions on where to look for innovations in systems research. To me, this stuff is exciting and positive -- the point is not that Linux and Free Software are bad but that there are lots of other cool things still to do!

AT&T critical of Linux?, posted 2 May 2000 at 23:29 UTC by apgarcia » (Journeyer)

Pike and the other AT&T folk have been critical of Linux and free software for some time. I believe this article raises some good points, though obviously I am nowhere near as pessimistic as he is about free software.

I'd be interested in hearing more on this, because my only exposure to someone from that culture was the keynote speech at last year's ALS by Norm Schryer, director of Broadband Services Research at AT&T Shannon Labs. That speech is very relevant to this discussion. He made a number of points that are complementary to Pike's. Perhaps I'll try to summarize it for you later.

At any rate, he said that he uses and likes Linux. He also showed off a prototype of a tiny special-purpose computer running Linux, a little box you can carry around that AT&T trusts enough to "expose" its private network to the internet, with the little box providing VPN services.

re: reality check, posted 2 May 2000 at 23:51 UTC by apgarcia » (Journeyer)

I'm not going to argue with you, graydon. Even if I'm right [and I believe that I am], you would probably win only cuz you're smarter than I.

ok, so I kind of sort of lied about not arguing. I just want to make two points:

  1. Pike says that much of systems research today is focused on metrics, or as he says, "Performance minutiae and bad charts." To this, I say, amen.
  2. Take exokernel as an example -- well, first another Pike quote: "New operating systems today tend to be just ways of reimplementing Unix. If they have a novel architecture - and some do - the first thing to build is the Unix emulation layer." Sound familiar?
I'm not saying that you're totally wrong, but I would respectfully ask you to take a few steps back and reconsider just how innovative the examples you cite really are.

[p.s. please forgive the poor formatting in my other messages. they looked ok when I previewed them in emacs/w3!]

Standards, posted 3 May 2000 at 00:48 UTC by raph » (Master)

One of the things that resonated most for me in Pike's presentation was the stifling effect of standards. Indeed, they often force bad compromises, or at the very least add a lot of additional complexity.

Yet, standards are also essential if there is to be any hope of things working, especially in today's world. Modern computer systems are made of lots of different components, with standards used to glue everything together. In the early days, systems were much more monolithic beasts, with just a CPU, some peripherals, an operating system kernel (if that!) and some applications.

What I get out of this situation is that we need better, simpler standards. Making a good standard is hard work. Most smart people wouldn't be caught dead on a standards committee. Most standards organizations have closed processes that do not reward elegance and simplicity. Even the "good" standards organizations like the IETF frequently produce "camels" like the IPSEC key management fiasco.

I think trying to make sense of systems at this macroscopic level is an important area where computer science research is falling down on the job. Similarly, things in the proprietary world get pushed around by short-sighted economic forces. The integration of complete systems is an area where free software is actually thriving. The fact that the Linux kernel supports most hardware available today is actually qualitatively different than the kind of work done to create the Unix operating system, or for that matter Plan 9. It's not just a matter of a lot of scut-work to write all the drivers.

There's a lot of work that desperately needs to be done to make things Just Work, but free software is about the only place that work is being done right now.

unix emulation libraries: a catch-22, posted 3 May 2000 at 01:28 UTC by graydon » (Master)

I can no longer count the number of times people have told me Berlin will never ever be anything more than a curiosity unless it has an X compatibility library. if we do get around to it, I will have exactly zero patience for anyone who tries to slam it for being "just a copy of X". writing a library which makes you compatible with something old does not make your work "just a clone" of the old system.

the examples I cited are innovative because they embody research and experimentation in "systems" areas which the "Unix, sockets and C" holy trinity has had some difficulty with, be that fine-grained nested security, application control over hardware, transparent clustering, process migration, provability, redundancy, self-repair and self-organization, scalable distributed computation, realtime, or higher-order abstractions.

I respect Pike too, as a researcher, but I think he's mistaken on this one.

The Thrill is Gone..., posted 3 May 2000 at 02:05 UTC by Uruk » (Apprentice)

It seems that in every field, every once in a while, somebody says "The Thrill is Gone!" and talks about how innovation has slowed down or become stagnant. Everything is just a "variation on UNIX", or everything is this, everything is that.

New ideas NEVER stop happening. Innovation NEVER stops happening. Sometimes it appears to slow, but as they say, hindsight is 20/20 - you'll never know what it is that's going on right NOW that is absolutely astonishing until it either gets rolled into something truly amazing or used as a stepping stone to something bigger and better.

I think this guy is spouting a lot of very common cynicism. I don't blame him or think he's stupid; my gut reaction to the whole thing is, "let's wait and see...".

The fall of OS research, posted 4 May 2000 at 19:14 UTC by Zaitcev » (Master)

Graydon is trying to put a happy face on things, but in my view a decay in systems research is truly afoot, as is evident from the disintegration of structures such as IEEE TCOS. Bos went so far as to whine bitterly about that in comp.arch of all places, only because nobody reads comp.os.research anymore.

Rob is surely a victim of the Plan 9 failure. It must be hard for him to see this marvelous piece of ingenuity go down the drain together with Amoeba. He is full of CS preconceptions too: dismissal of Linux, "PhDs using ... Emacs and TeX".

Yet I believe there is a certain truth in what he wrote. "Is anyone outside the research field paying attention?" I guess not, and that invalidates the list of alive and happy projects that Graydon presented.

it's just the numbers, posted 5 May 2000 at 10:07 UTC by Netdancer » (Journeyer)

I was very surprised to read Pike's paper because his older stuff is much more positive. Anyways, I think that he is dead wrong.

Yes, of course, we don't see fundamentally new things in the mainstream. No surprise there; that's what mainstream is about. The difference from the 70s and 80s is that the mainstream is dominant now, whereas earlier the mind-share of academia was much bigger.

So, basically, systems research did not grow at the same pace that the rest of the computing world did. Is that a problem? I don't think so. The majority of the really important systems problems have been solved. Maybe not perfectly, but well enough! Higher-level problems can be tackled now, and that's where the interest is. After all, for the majority, it's about getting something done, not about shaking the foundations frequently.

I also don't agree that nobody is noticing the systems research that's still happening. Recently, on a rather mainstream security auditing list, the EROS OS was discussed, and it generated quite a bit of interest.

To summarize my position: largely, what Pike calls a decline is more a matter of perception and of priorities. Interesting research is still happening and is having an influence. It could definitely be better -- but what couldn't?

Last, but not least, despite all the counter-arguments above: a big thumbs-up for Pike's effort. He probably succeeded in generating a bit more interest in systems research, at least here on Advogato and probably elsewhere as well.

good enough, isn't, posted 6 May 2000 at 03:50 UTC by Ankh » (Master)

The trouble with being satisfied by things that are good enough is that you get stuck in a local maximum; you miss out on the solution that's 1,000 times better. For many people EDLIN and BASIC were once good enough, and before that the teletype was the state of the art in user interface design.

One of the best UIs I've seen used a pen and a graphics tablet, but it got overtaken by the (cheaper) mouse.

Rob Pike is being provocative, of course, but he is also right in many ways.

Graydon, I think that an X compatibility layer of some sort for Berlin would probably see Berlin get many more developers very quickly. I know I for one haven't tried Berlin out because I don't want the hassle of having to quit it to run almost any of the programs I use on an hourly basis.

NeWS had the same problem: without X compatibility, no-one listened. Given a mediocre X compatibility layer, it suddenly became interesting to a lot of people. Of course, the X compatibility also helped to kill it, because people coded to X and not to the technically superior NeWS API, and that's always a risk.

The Unix legacy is a common ground, but it's become more, and hence less, than that. Unix was small in V6 days; you could run it with 128K of RAM on your PDP-11 (separate I/D space, 2*64K). Compatibility with that is not so hard; compatibility with POSIX and all of a "modern" Unix is another matter.

Unix has grown, but not always with the same quality of insight that its inventors had. Are we stuck in a local maximum, with something convenient and comfortable but not as good as it could be?

Systems research isn't just going on in the academic community any more, posted 6 May 2000 at 22:24 UTC by argent » (Master)

So of course it seems to be a victim of the open source world (including Linux), but that's also an opportunity.

If Pike wants to see Plan 9 be something other than an irrelevant sidebar like Sprite or Amoeba, there's an easy way to make that happen. As I noted in the Usenet thread on the same subject... bringing Plan 9 into the Open Source domain would change things dramatically.

Classical & Quantum mechanics, posted 8 May 2000 at 20:06 UTC by tjl » (Journeyer)

Actually, the situation in CS is beginning to resemble what was happening to classical mechanics at the end of the last century (oops, the 19th -- the last century is now the 20th!): things started really stagnating because everything seemed to be known.

Then quantum mechanics was discovered. Total paradigm shift. The field was completely rejuvenated.

No, I'm not someone who's going to claim that quantum computing is the panacea and will solve all problems. It's not. I took a careful look at it when it was new and decided that it isn't going to scale enough (up to a hundred or so bits) to be interesting. What is needed, as Pike says, is new user interfaces, new concepts and so on.

I currently spend just about all my time working on a project that could be a similar paradigm shift for computing: GZigZag. To me, it solves most of Pike's problems. Naturally, this is a highly subjective opinion but OTOH, I'm standing behind my words by actually putting in a lot of work to get this thing working.

The divorce of CS and business, posted 8 May 2000 at 20:44 UTC by amk » (Master)

I think another contributing cause is that the commercial world and the CS world have almost no contact. Remember when Byte would do a special issue on something like Smalltalk? It brought an academic idea into wider currency, so that at least people could hear about concepts such as OO. Today, when I flip through an ACM journal, and then through PC Week, the two worlds are completely separate; the academic systems don't register on the radar of commercial people at all.

Here I have hopes that free software can help. If you write a new distributed filesystem, or UI concept, or whatever, it provides some slight chance that it will be picked up and used -- for real applications -- instead of just providing material for a paper and then gathering dust forever more.

Academia caused a lot of the problems, posted 12 May 2000 at 02:23 UTC by alan » (Master)

The lack of source code to study -- because academics happily gave away their rights to source code -- is one of the key things that brought about the end of real academic computing. You learn to build good systems by seeing good systems. The value of Minix and now Linux as teaching tools is huge. Unfortunately a lot of universities see Java, Visual Basic and NT career paths, and don't teach by example from good code.

I agree with Rob on Linux. Our goal is to produce a stable production OS. People are doing real research with Linux, but it is not integral to it. One of my hopes is that this will end the current sad state of OS research, where you have to reimplement big chunks of 'every other OS' in order to do your one small new idea well enough to prove the point, and then throw it away. With free software you can do research for Linux or *BSD and eventually fold it back in.

Linux builds on research, but most of the good stuff we are working from is 1980s work that still hasn't been turned into a real product.

And IMHO Hurd is real OS research.

Alan
