Well, it's been a while...
FreeSCI
Slashdotted! Somehow, I had expected to rise to a higher
level of awareness, or something like that. Well, can't have
everything.
Anyway, we now have sound, tri-linear filtering (plus
something I used to call bi-linear filtering which
definitely isn't), alpha-blending on Alphas (see "Assembly"
below; it's just a feature, so I didn't feel too bad about
coding it non-portably), and lots of bug fixes. I guess it's time
to prepare for a feature freeze again...
Exult
Haven't touched that project in a while, but I'll get back
to it soon; they're gearing up for the next release, so I'd
better make sure cxx works.
Assembly
It's the first time in roughly two years that I've touched
assembly, and it feels quite weird. The Alpha
instruction set is very different from the ia32 one I
grew up with. With all of its extensions, it's not quite
as clean as the original MIPS one, but the core concepts are
pretty clean and RISC. I definitely don't mourn my ia32
assembly days.
It's really a pity that we (well, most of us, anyway) are
stuck on the ia32 architecture because of "binary
compatibility" issues. Shouldn't it be possible to do a
complete flow analysis of a program, transform it into a
device-independent representation (Java bytecode comes to
mind, although I'd prefer something more functional and
tree-like... maybe RTL?), and then re-assemble it for an
arbitrary target device? Sort of an ia32 compiler frontend.
Normal compilers generally have to make some concessions
in code generation; specifically, they need to be
reasonably fast in order not to slow down the development
cycle (this used to be a major problem before hard disks and
compilers with few passes became common; those advances
pretty much killed off research in incremental compilers, but
that's a different story). However, a cross-assembler would
typically be run on a program that is already known to work;
therefore, if the cross-assembler itself were fully
operational, it would not need to run more than once per
program. Consequently, it could be slow as hell (hey, it
should be possible to get Prolog to do that stuff by
back-tracking...). External functions (e.g. DOS: Int 0x10,
0xa000 memory access) would still have to be modelled in
some way, of course, but I don't see why this shouldn't be
possible if we accept a moderate performance loss. Has
anybody heard of a project like this?
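Just for fun, here's a toy sketch (in Python) of the lift-and-re-emit
idea. Everything in it is made up for illustration: the two-instruction
"frontend", the tree-shaped IR, and the RISC-flavoured output syntax.
A real binary translator would of course also need full control-flow
analysis, a memory model, and so on, all of which is omitted here.

```python
# Toy binary-translation sketch: lift two ia32-style instructions into
# a tiny tree IR, then re-emit them for a hypothetical 3-address RISC
# target.  The IR and target syntax are invented for illustration only.

def lift(insn):
    """Parse one 'mnemonic dst, src' string into a tree ('set', dst, expr)."""
    mnemonic, _, operands = insn.partition(' ')
    dst, _, src = (s.strip() for s in operands.partition(','))
    if mnemonic == 'mov':
        return ('set', dst, src)
    if mnemonic == 'add':           # ia32 is two-address: dst := dst + src
        return ('set', dst, ('add', dst, src))
    raise ValueError('unhandled mnemonic: ' + mnemonic)

def emit(node):
    """Re-assemble one IR tree as text for a made-up RISC target."""
    _, dst, src = node
    if isinstance(src, tuple):      # e.g. ('add', 'eax', 'ebx')
        op, lhs, rhs = src
        return '%s %s, %s, %s' % (op, dst, lhs, rhs)
    if src.isdigit():               # immediate operand
        return 'li %s, %s' % (dst, src)
    return 'move %s, %s' % (dst, src)

if __name__ == '__main__':
    program = ['mov eax, 5', 'add eax, ebx']
    for insn in program:
        print(emit(lift(insn)))     # li eax, 5 / add eax, eax, ebx
```

Note how even this trivial fragment has to turn ia32's two-address
"add" into a three-address form; scaling that up to flags, memory
operands, and indirect jumps is where the real work (and the slowness)
would be.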
Anyway, this is too much work for an evening project;
maybe I should look into it (or at least the theoretical
aspect of it) for my Master's Thesis...
Work
Yeah, I'm back at work again. XPath, XSLT, Java, some
business buzzwords, and roughly everything in between. This
in itself would be pretty boring, but I just love the work
atmosphere: since it's a research institute, we have enough
time to think about things before we build them, and our
bosses actually have a clue about the stuff they're doing
(or, at least, are able to admit it and ask for help when they
don't, which appears to be a surprisingly uncommon trait).
Anyway, I got to install Debian/Sparc on a Sparc notebook
(which didn't have working fd support). A rather fun and
enlightening experience, until it turned out that 'sed'
would segfault on complicated expressions, such as the ones
configure scripts use. OK, NP, I just grabbed the most
recent release from the GNU ftp server, ran
./configure...
D'Oh.
OK, well, maybe it wouldn't be quite that easy. Still, a
manual compile didn't really improve the situation: it
built, but it segfaulted all the same. So I took the BSD sed
and tried to compile that. This was the moment I realized
that the BSD people care about OS independence in their
system tools about as much as the GNU guys do...
Anyway, it works now. Anyone who wants a copy of the ported
BSD sed, just give me a call.
"Retro gaming"
(Warning: Rant ahead)
That phrase sounds pretty weird to me. Its implication is
that the games it covers are "obsolete" in some way, that
they're more of a historic curiosity than actual games.
I beg to differ.
Don't get me wrong- I whole-heartedly agree that there are
great games with much better graphics and sound than, say,
Space Quest 3 or Ultima 7, and that some of those games are
actually fun and challenging.
(Of course, I've grown somewhat out of touch with the "gaming
community", so I'll just assume that there are new
games which fit that description...)
Still, how does that make those old games obsolete? I mean,
chess and Go are ages old, yet someone playing them is not
considered "retro". Some games that are only a few
decades old, though, are looked down upon, with people
generally assuming that a WAFF might be the only reason to
re-play them.
I guess this is yet another sign of how much power
advertising companies, marketing divisions, and the mass
media hold over us nowadays. They don't need to convince everyone,
but if they convince enough people, those will convince
others. In this case, they'll convince them that they need
those great new graphics and surround sound, or they're
stupid.
Dealing with this kind of mental enslavement is likely to be
one of the greater challenges of our future (and no, I'm not
just talking about "retro gaming" here...)