The first wave

Posted 10 Aug 2002 at 12:06 UTC by Alleluia

The first wave is ending. The first wave is a generation of programmers, beginning in the nineteen-forties and ending in the next ten years or so. In the great long history of computing, which will extend hundreds and thousands of years into the future, these first sixty years will have a distinct set of circumstances that will never be repeated, except in echo form. Let's call it a 'wave', as though we were considering an ocean. This wave of programmers came to the labor of programming with nearly one hundred percent passion for adventure; money as a measure of skill was secondary to that originating passion.

In the 1940s, programmers and hardware techs were hardly distinguishable, because the systems were so new in concept that programmers were inventors, hardware developers, marketers, and software programmers all in one. As the art of computing advanced, and such basic structural decisions as binary over trinary computing systems were settled by the early 1950s, these roles began to separate into distinct 'fields' within computing. With these came mathematical and scientific advances specializing in information, targeted to the needs of computing. Finally, large-scale social changes began to take place to make room for computing, and people began to sort themselves into categories.

This was the beginning of the first wave. Computing was driven by people whose passion for computing was primarily one of unabashed adventure, and for the fortunate ones, the passion was accompanied by secondary measurements, like money, careers, social status, and so forth.

This first 70-year 'generation' of programmers, which is segueing into a new generation as I write, carries great numbers of the purest and deepest form of geek that will ever populate the realm of computing. There will always be supergeeks to advance the industry; there is a beautiful purity to the mind and nature of a person who pursues his career for the sake of adventure first, and for money or status second. He is willing and able to take huge leaps into the future with little provocation, and this unpredictability is necessary for a certain type of growth.

In the future, and increasingly now as these words are written, computing will be driven from a "top-down" direction. Please forgive the terminology here; some readers may consider this analogy to be upside down, and they are perhaps correct in doing so. By "top-down" I mean that the major decisions driving computing are increasingly made by people whose understanding of computing is secondhand; they are managers and accountants, not programmers, inventors, or hardware techs.

This is a bold claim, for it can be argued coherently that computing advancement has always been in the service of limiting financial risk. In fact, some of the largest computing decisions have come as calculated risks taken by people with purely financial motives, as when AMD created serious competition for Intel back in the day. Yet what I am saying here takes into account the fact that few people realized how pervasive computing would become, even into the late 1970s, when the basic structure of computing, and even the Internet in rough form, was already taking final shape.

Some of the programmers reading these words can still remember first hearing of Steve Jobs, The Woz, Bill Gates, and, more quietly, hackers like rms, Clifford Stoll, or, say, CHAOS, as young 'kids': energetic, and building something independent of industry giants like IBM, the massive steam engine with momentum from pre-computing days. These kids were pure, building their entire lives around computers the moment it became technically possible to do so, having no momentum but their own enthusiasm. Now, a couple of decades later, their bold leaps have shaped computing in fundamental ways. We still have room for a few more such people in the first wave, and people like Bruce Perens, Eric S. Raymond, Linus Torvalds, CmdrTaco, Larry Wall, Raph Levien, Rasmus Lerdorf, and Caesar are examples of how to forge a high-profile career in computing without money as a primary concern.

Yet this first wave is winding down. Time to break the hacker's jargon dictionary into a new edition, for the language is changing. Computing for joy is already being infused with an incoming generation of people who compute "for a living." The standards of joy are precious, holy, awesome to behold in raw form; the standards of "making a living" are safe, stable, calculating every risk by marketing standards rather than technical-adventure feasibility.

Some readers will say that the transition already happened, perhaps in the late eighties or early nineties, but I specifically want to include factors like gaming and the web, which have become industry-moving forces only in the past few years; we are still exploding outward. To borrow an analogy from physics, we are still in the first few milliseconds of the Information Big Bang. The great paradigm shift, where "time", "temperature", and "space" become relevant descriptors, is only now descending.

There are characteristics of a first wave which will never be repeated in such pure form. As geeks, we are continuing to separate out into categories, some of which do not yet exist. Thus by nature of time placement alone, we see the whole thing more "holistically" than it will ever be seen in the future, rare geniuses aside. Unlike the 1950s, when we decided on binary instead of trinary hardware, it is now possible to have a career as a "hardware tech" and barely learn a whit of programming. Whole generations in the future will be devoted to exploring specialized fields within computing that were created within this first wave.

Creation is an eternal process, meaning that there are aspects of it which go further back than we can perceive and extend further into the future than we can contemplate. Every now and then, we come to a great punctuation in the cyclical equilibrium of our destiny. With now a history of 4.5 billion years behind us, and conceivably about the same ahead of us, it should be clear that while there is nothing new under the sun, there is also a whole lot new under the sun.

The question is, to each of us, "What can we do to grab hold of those pure things which are happening during this moment in the Information Age, to hold them up as standards which future generations will look back upon and savor for their noble character?"

This is important. Being merely moved by the winds of a revolution is like not being in the revolution at all; you might as well be born in another time if you aren't going to grab hold of the revolution as it roars along. Let us not fold into the second wave quietly; let us revolutionize computing twelve more times in the next ten years.

Work for free.

What brought a smile to my face..., posted 10 Aug 2002 at 16:46 UTC by CaptainNemo » (Journeyer)

... was the thought of my children having any idea who "CmdrTaco" is :)

Seriously, though, the "waves" have been crashing for a very long time. The only reason the people riding the surf of this one seem any different from all the other inventors and great minds throughout history is that they are alive now; we know them, and they are our heroes. The next generation will make its own waves and have its own heroes.

Jobs and Woz are the Gutenbergs of Computing, Gates is the Rockefeller, they will go down in history because of what they accomplished, not because of their "purity".

Huh?, posted 14 Aug 2002 at 15:07 UTC by tapir » (Journeyer)

It's presumptuous to suppose that computing is going to have a hundred- or thousand-year history, unless you mean people counting on their fingers.

People who think about the future of the world system tend to believe that our civilization will face a possibly terminal crisis in the 2040-2050 timeframe, although some believe things could fall apart as early as 2010 to 2020. Although many believed that the resolution of the 1970s Watergate/stagflation syndrome invalidated the thesis of "The Limits to Growth", that book generally foresaw a crisis in the 2050 range, with a last chance to avert it around 2015.

Inexpensive oil is going to become scarce in the next 20 years. Global warming will curtail our use of other fossil fuels as it ends up having a serious impact on our civilization around the middle of the century. The supply of Uranium-235 is much smaller than you probably think, and every fast breeder (which uses Uranium-238) has caught fire, melted down, or both.

On top of failure modes caused by running out of resources or doing irreparable harm to the biosphere, we also need to consider cultural failure modes; i.e., between television, video games, and people not being willing to pay for schools, we can't produce a literate (never mind literary or numerate) population. Daniel Bell points out that consumer capitalism ultimately creates attitudes (being "cool") that destroy the underpinnings of that system (the "Protestant ethic").


Other than that, the rest is bogus too. There have always been people into programming for the money, and there have always been people with other interests. That isn't changing. In the last 70 years or so we've already had time for multiple "waves": people who grew up with batch-mode technology, interactive systems, graphical user interfaces, OO technology, and many other approaches that have changed the way we use computers. Even the "open source" movement has roots in the 1970s UNIX community and the early 1980s CP/M community.
