Once upon a time in the not-too-distant past,
a hacker I know blogged about
using object-oriented C
to implement
a lightweight imitation of some of C++'s features
for his latest project;
almost immediately,
somebody saw fit to reward this charming piece
of acceptably self-congratulatory writing
with a stern and quite public deconstruction.
Does this scene seem familiar?
Why does this keep happening?
And what, if anything, can we do about it?
We can hardly hope to appease
all of hackerdom's malcontents
—
but we can at least try to avoid
stepping on each other's toes.
Accordingly,
this article will waste no time
on a platitudinous condemnation
of the surf-by put-down;
rather,
I wish to take a moment to reflect
on what the hacker did to earn it,
and to consider what he might have done
(or, rather, not done)
instead.
Let me start by stating for the record
that I think
the hacker's solution was meritorious as such;
that is, it solved the problem he formulated,
so I approve of it
on the grounds of its workingness.
The techniques employed in
this
approach
are simple and effective,
and were considered
best-practice
C in the eighties.
Some people found this style long-winded, though,
and facilities were soon devised to address
the awkward verbosity of the incipient
object-oriented C
idioms.
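For readers who have not seen it, the object-oriented C idiom under discussion might be sketched roughly like this; the `shape` and `rectangle` names are invented for illustration and do not come from the hacker's project:

```c
#include <stdlib.h>

/* The struct carries a pointer to a table of function pointers,
 * playing the role of a C++ vtable. */
struct shape;

struct shape_ops {
    double (*area)(const struct shape *self);
    void   (*destroy)(struct shape *self);
};

struct shape {
    const struct shape_ops *ops;
};

/* A "derived class": the base struct is embedded first, so a pointer
 * to a rectangle may be treated as a pointer to struct shape. */
struct rectangle {
    struct shape base;
    double width, height;
};

static double rectangle_area(const struct shape *self)
{
    const struct rectangle *r = (const struct rectangle *)self;
    return r->width * r->height;
}

static void rectangle_destroy(struct shape *self)
{
    free(self);
}

static const struct shape_ops rectangle_ops = {
    rectangle_area,
    rectangle_destroy
};

struct shape *rectangle_new(double width, double height)
{
    struct rectangle *r = malloc(sizeof *r);
    if (r == NULL)
        return NULL;
    r->base.ops = &rectangle_ops;
    r->width = width;
    r->height = height;
    return &r->base;
}

/* "Virtual" dispatch: callers need only the base type. */
double shape_area(const struct shape *s)
{
    return s->ops->area(s);
}
```

The hand-rolled vtable, the embedded base struct, and the downcast are precisely the verbosity that C++'s `virtual` keyword and inheritance syntax later absorbed.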
The pre-processors that implemented
these and other improvements to C
went on to become distinct languages
(namely, C++ and Objective C)
and communities formed around them.
Or perhaps it was the other way around,
says the sociologist looking over my shoulder
—
but that is neither here nor there for our purposes.
The point I want to make is this:
one can reasonably expect
the people who embraced these solutions
to wince when someone promotes the use of
the very idioms their community has
so laboriously abstracted away.
Did the hacker really not anticipate this response?
If he did,
why did he preface the presentation of his solution
with an explicit rejection of C++?
Whatever his motivation was,
it's fair to say he paid for it.
Indeed,
justifying one's work
by scorning a mature system
on which one is not an expert
works against the legitimate goal
of being lauded for one's skills.
It puts people off,
and will only earn one points with the weenies who happen
to agree.
Moreover, it is likely to elicit
overt retaliation from vigilantes
(mostly the weenies who happen to disagree)
and a spot on someone's idiot list.
Indeed,
in the aftermath of such a gaffe,
any subsequent plea
aimed at demonstrating compunction and competence
– however unassailable and incontrovertible –
is likely to be
shrugged off as so much prejudiced ignorance
and
answered with a curt dismissal.
So here's my twopence-worth of unsolicited advice for
programmers who desire
the recognition of their peers:
share your wins (and your FAIL)
and toot your own horn if you like,
but try not to steal the wind from another's sails
when you do.
Because there's
more than enough hot air to go around already
—
certainly enough for every foghorn blowing at C.
there's a technique i saw rusty russell deploy, once, and it's brilliant. on receipt of a message which was clearly asking for trouble, rusty wrote:
"i think you mean to be asking xyz, so what i am going to do is to rewrite your comments, and your question, in a less inflammatory manner, and then i'm going to answer that".
which of course _immediately_ takes the sting out of any nasty undertones that the OP did not have the good sense to remove, and, just as importantly, filters out the _useful_ bits from whatever mad, bad world the OP lives in, and carries on from there - in a productive manner.
we're human.
not only are we human, but also, we're not all _paid_ to work on free software. so, in many cases, there is no financial incentive, the removal of which could normally be used to threaten someone - who isn't an employee, contractor or sponsoree - to get them to stop [boss: "if you make us look bad by sending messages like that again, we'll fire you"].
[translation being put very bluntly: the free software people being _paid_ to work on free software have it nice and easy]
so it's not even a two-way street, it's a multi-way street, where there are more goals being stated than there are people in the project. some people like to blow their own horns as _well_ as contribute to the improvement of a free software project. that's fine... as long as everyone else agrees that horn-blowing is useful!
I wish you could supply a link to the 'deconstruction'. I wonder if it
has merit.
I read the paper in question. You *could* criticize such a system. For
instance, some of the code is not to my taste in terms of style.
However, we could say that about most of the code I see. From that
author's point of view, they were likely adhering to standards they know.
[As an aside for those who are wondering, there are things like this:
    if (condition)
        do_something();
    else
        do_somethingelse();
The missing braces are a constant source of bugs. The above should be:
    if (condition) {
        do_something();
    }
    else {
        do_somethingelse();
    }
I have been coding C for a quarter century. In the field, breaking the
above simple rule of thumb creates bugs. They occur when people add
another line and expect it to work as if it had braces. This just
happens. You can argue all you like, but the moment you leave those
braces out, you invite Murphy in the door. As you know, Murphy is
legally bound to enter. ]
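To make the failure mode concrete, here is a minimal sketch (the function and counter names are invented for illustration). The unbraced version looks guarded by its indentation, but only the first statement actually is:

```c
/* Hypothetical illustration of the brace rule above. */
static int warnings = 0;
static int aborts = 0;

static void count_without_braces(int error_count)
{
    if (error_count > 0)
        warnings++;
        aborts++;   /* indented as if guarded, but runs unconditionally */
}

static void count_with_braces(int error_count)
{
    if (error_count > 0) {
        warnings++;
        aborts++;   /* genuinely guarded */
    }
}
```

Called with an error_count of zero, the first version still bumps `aborts` -- exactly the kind of bug that appears when someone adds "just one more line" to an unbraced body.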
Quite the aside...
Anyway, I found the work charming as an exercise. I agree with the
above: I too, "approve of it on the grounds of its workingness".
The backhanded compliment "considered best-practice C in the eighties"
ruffled my feathers a bit. Principles of abstraction, data hiding,
modular construction, re-entrant design, 'sensible names' (I must
comment on that below), etc. are just as important now as ever, and,
it seems, broken more often than ever. The proof: we see bugs,
cruft and fragility all over. I wrote my oft-mentioned base64 code
because every other implementation either had bugs or was simply not
portable for any practical purpose.
In the work referenced above, the author says:
"The techniques described here grew out of my disenchantment with C++
when I needed object-oriented techniques to implement an interactive
programming language and realized that I could not forge a portable
implementation in C++. I turned to what I knew, ANSI-C, and I was
perfectly able to do what I had to."
You could argue with that, but you would be on shaky ground. For code
that stands the test of time, ANSI-C is one of your *ONLY* options. I
recently re-compiled something I wrote in ANSI-C in the 1980s. It
issued a shower of warnings, because ANSI-C was not entirely stable back
then. It still ran, though, and it was easy to clean it up. I wrote code
in Modula2, various dialects of PASCAL (notably TurboPascal3), many
dialects of BASIC, Clipper (ugh), C++, x86 Assembler, 6502 Assembler,
forth, Fortran, PL/1, etc. The only programs that compile and run on my
current machine are ANSI-C (ish). Assembly code still compiles and *can*
run, but I must find MASM and a linker to do it. Java, of course, did
not even exist back then. However, any non-trivial Java program from my
first days programming in Java will not work without alteration. Java
seems still to be shifting sand. If I had to bet what would live the
longest, I would still pick ANSI-C. [I can't help but throw in a pitch
for Sybase/MSFT Transact-SQL here. My old non-trivial SQL stored
procedures won't work without alteration, but they are easy to port
forward]
Of course, I have played with any language I could get my hands on. Most
of the 'latest and greatest' are part vaporware and/or don't even work
for trivial code. Some example code does not even compile and some that
compiles does not work.
ANSI-C is still often the best choice to build working, maintainable and
portable code that has a chance of living for long enough to pay for
itself. Assembly code is just too 'un-portable' and awkward. Except in
the hands of experts, ANSI-C produces code just as good as or better
than hand-coded assembly does. If I had to pick an OOP, I would
(reluctantly) go with C++. However, even C++ is a poor option if you can
get away with vanilla ANSI-C. [Note that I am well aware of C's failings
as an OOPL, but C++ is a gruesome fix.]
To paraphrase Einstein, the language you use should be as simple as
possible, but no simpler. Many programs in daily use gain nothing from
coding in an idiom more abstract than 'simple' procedural programming.
Laughably, a good portion of the 'object-oriented' code I see is just
bad procedural programming anyway. All the 'OO' language bought them is
torturous indirection.
There are times, when dealing with things like a GUI, where ANSI-C does
not have what you need to manage things. At that point, the options are
not that good. I can see why someone with the ability would just throw
in the towel and roll their own.
It is a little funny, but also scary, how blithely people dismiss ANSI-C
and present their silly special case or non-working alternatives as
superior. A vanilla ANSI-C program will compile anywhere there is an
ANSI-C compiler. I am not aware of a non-trivial operating environment
that does not have an ANSI-C compiler available. In fact, if you include
the operating system, most tool-chains have at least some parts that are
in C.
I have coded (still do) in many languages. Most have their charms. Even
COBOL has its place. The work in question may be ill-advised, but it may
also be the best thing for the stated use. It depends. As a body of
work, it appears thoughtful. C++ *can* often be fragile, non-portable
and vulnerable to pernicious bugs. I accept their assertion that they
chose the best course for their needs.
It is sensible to choose the language best suited for a given purpose.
To hack a quick GUI under Windows, I use VB. Why? Because it is easy, it
works and it continues to work. To hack a console tool I tend to go with
C, but sometimes I use Perl or other shell scripting because it is the
best choice. For server side scripting, I most often use PHP, but again,
other things are sometimes called for. Sometimes, Java is best for both
front-end *and* back-end scripts. I have not needed it, but I can see
how functional stuff like Scheme would be best.
Software is a tough business. Other programmers can be a tough crowd.
Someone will always criticize what you do. Coders wedded to a given
paradigm can see things outside of it with a jaundiced eye. They usually
don't have enough time at the keyboard to render a sound opinion. [It
takes about a decade 'heads down' at the keyboard -- ten or twenty
thousand hours to become expert.] In some cases, though, they are right,
even if their reasons are wrong. That might be true in this case.
Someone is trying to use an object-oriented paradigm in a procedural
language. It lacks the native 'syntactic sugar' to make OOP simple.
However, it could be true that the right target OOP tools do not exist.
Despite the hype of the (admittedly literate) Java folk, it is hard to
make practical, working, non-trivial code in that language that will
stand the test of time across platforms. If, say, you want to make a
stand-alone console filter like 'grep', Java is a poor choice.
As I said, this is a tough crowd. No matter what you do, you will always
draw criticism. Sometimes, it seems inversely proportional to the skill
of the critic.
Sadly, for someone to be good enough to sweat the details, they have to
take pride in their work. Sometimes, that can look arrogant and you
might want to knock them down a peg. However, earnest programmers charm
me, even when they take a bad turn. As they mature, they will become
more humble, as will their critics. Meantime, you just have to suck it
up and take the criticism with a grain of salt.
[Re: 'sensible names' -- There are a host of 'rules of thumb' to create
sensible names in software. I will mention one here that 'bugs' me:
names and distance. Declarations should be as proximal as possible. That
is, they should be declared within the narrowest scope within which they
can operate. In a language like C, that means within the closest block
defined by braces. Distance as I meant it is composed of a few things.
One is scoping as the above. Another is physical distance -- the largest
number of lines between declaration and use. Another is temporal -- the
largest amount of time (allowing code drift for instance) between the
creation of the name and its retirement. Another is conceptual -- the
more difficult or unfamiliar the concept, the more distal it is. Names
should generally start short and get progressively longer and more
meaningful with distance. For counters, i, j and k are used by
convention in many languages. There is no reason to waste time (and make
unnecessary presumptions) with a name like
register_integer_local_counter_one. If the reader needs that scaffolding
they won't be able to understand the code anyway. On the other hand, it
would not be appropriate to use the variable 's' as a pointer to a
global structure. That should take a name like
'Ill_Advised_Pointer_To_Presumptive_Global_Screen_Handle_Refactor_Me'.
I'm using poetic license there, but you get the idea.]
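The distance rule above can be sketched in a few lines of C; the identifiers here are invented for illustration:

```c
/* File scope is the most distal scope in this translation unit, so the
 * name is long and descriptive. */
static double global_sensor_calibration_offset = 0.25;

double sum_calibrated(const double *samples, int count)
{
    double total = 0.0;
    int i;  /* narrowest useful scope, three lines of life: short by convention */

    for (i = 0; i < count; i++)
        total += samples[i] + global_sensor_calibration_offset;
    return total;
}
```

The counter travels three lines from declaration to retirement, so `i` carries all the meaning required; the file-scope variable may be read far from its declaration, so its name does the remembering for you.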
As always, you may now "Flame On".