Friday, January 9, 2009

Bad ACS, no Twinkie

So I've gotten back into Doom modding again, and during the process I have had to pick up ACS again. After spending over an hour trying to print a single sentence to screen (I'll get to that in a second) and having Doom Builder crash Yet Again, losing all of my progress, I figured I needed a break to blow off steam.

So why did it take me so long to print a message to the screen? Modularity. The mod I am making is going to have a lot of text printed to the screen, and, realizing this up front, I decided to go ahead and write a function that would center the text and compute its duration (how long it should stay on screen) from the string length, to save me some typing. There is a rant in here alright, but there are also some interesting programming issues that I want to touch on lightly. For example, how do we balance planning ahead and building foundations with actually getting shit done? There, that was lightly, wasn't it?

Basically, I wanted to go from typing this:

hudMessage(s:"mah message"; HUDMSG_PLAIN | HUDMSG_LOG, 0, CR_GRAY, 0.5, 0.5, duration);

to typing this:

pcThink("mah message", duration)

and have the 'duration' variable be intelligently calculated (with the option to explicitly set it in cases where the default algorithm generates suboptimal results). Thus I set out to write the pcThink function. After several iterations, I ended up with the following:

function int pcThink(str thought, int duration)
{
    // a value of 0 for the duration means the caller wants
    // us to figure it out for ourselves
    if (duration == 0.0)
    {
        duration = strLen(thought) * 65536;
        duration = duration / 10.0 + 1.0;
    }
    hudMessage(s:thought; HUDMSG_PLAIN | HUDMSG_LOG, 0,
        CR_GRAY, 0.5, 0.5, duration);
    // return our duration in tics so it can be used in
    // delay() calls
    return duration * 35 / 65536;
}


If you can spot the mistake, then maybe you deserve a Twinkie. ACS does not. I'll give you a hint: the symptom of the bug was that my messages were not lasting long enough. And no, there were no compile errors.

Okay, maybe you noticed my liberal mix of floats (they're actually fixed point values in ACS) and ints. duration is, after all, declared as an int, and I am multiplying it by 65536 in one place then dividing it by 10.0 in another. Am I crazy? Have my rugged good looks finally gotten the best of my mental faculties? Perhaps - nay, likely. But that is not the cause of my problems in this instance, though you would be looking in the right direction.

hudMessage takes a fixed point value for its last parameter, which represents how long the message is supposed to be displayed on the screen. I am passing in duration here, which was declared as an int, but like any good language ACS coerces ints to floats (fixeds?) and vice versa, which all sounds keen and gives one a warm fuzzy feeling, right? Except that this coercion is done in a manner described most accurately as despicable. ints in ACS are 4-byte storage containers, pretty typical. Fixed point values do not technically exist in that there is no fixed keyword like there is an "int" keyword, though there are fixed point literals, which can be illustrated by the following declaration:

int wolfInSheepsClothing = 50.0;

The way this is stored is that the upper 2 bytes represent the integral value (things to the left of the period) while the lower 2 bytes represent the fractional value (things to the right of the period). This is atrocious! 1.0 does not equal 1 in this system, and not just because it actually equals 1.0000001. No, because 1.0 actually equals 65536! In Soviet ACS, 1.0 + 1 equals 65537! I'm not even kidding! Who the hell thought this was a good idea?! The error in my above code is where I am dividing by 10.0, intending to increase the duration by one second for every ten characters in the message; but since 10.0 is really the raw value 655360, I instead increase the duration by one second for every 655,360 characters!
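If you want to see this for yourself, a throwaway script like the following should do it (I'm assuming a ZDoom-style ACS setup here, and the script number is arbitrary):

script 1 OPEN
{
    // fixed point literals are just ints wearing a funny hat:
    Print(d: 1.0);     // prints 65536 - the raw bits of "1.0"
    Print(d: 1.0 + 1); // prints 65537 - fixed 1.0 plus integer 1
}

And for the record, here is what pcThink should have looked like, as far as I can tell. The trick is to divide by the plain integer 10, which scales the raw fixed value correctly, instead of by the literal 10.0 (really 655360):

function int pcThink(str thought, int duration)
{
    // a duration of 0 means the caller wants us to compute it:
    // one second per ten characters, plus a one second minimum
    if (duration == 0)
    {
        duration = strLen(thought) * 65536; // string length as fixed (1.0 == 65536)
        duration = duration / 10 + 1.0;     // integer 10, NOT 10.0!
    }
    hudMessage(s:thought; HUDMSG_PLAIN | HUDMSG_LOG, 0,
        CR_GRAY, 0.5, 0.5, duration);
    // convert fixed seconds back to integer tics for delay() calls
    return duration * 35 / 65536;
}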

Other gripes with ACS include dividing time into units of 35 tics per second (I'm sure there's a good reason...) and converting strings to ints when using the plus operator instead of concatenating or generating a compile error. That said, ACS is actually a pretty good language for what it's meant to do, and it provides script parallelism that is easy to reason about.

I am getting sick of goofy language design decisions that pander to low-level needs and slow down my high-level development. We need better abstractions that fit the domain knowledge of mathematics and text we were all taught, so we can write "x = 1 + 2.0" without worrying about coercion tricks while "x = "foo" + 7" or "if (someIntegerValue)" will not compile. And don't get me started on pointer arithmetic. Also, why on earth have units not become more prevalent? I might not remember whether delay(int) expects tics or seconds, but if I am allowed to type "delay(70 tics)" or "delay(2 seconds)" at my whim, I don't have to remember. And static unit analysis at compile time could avoid certain embarrassing blunders.
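Until a language gives me real units, about the best I can fake in ACS itself is a named constant. A minimal sketch (TICS_PER_SECOND is my own name, not anything ACS provides, and again the script number is arbitrary):

#define TICS_PER_SECOND 35

script 2 (void)
{
    // not real units, but at least the call site says what it means
    delay(2 * TICS_PER_SECOND); // "2 seconds" instead of a bare delay(70)
}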

Oh, and indexing starting at zero! Dijkstra be damned, I still find this to be a counterintuitive (and thus incorrect) language design decision after over a decade of programming with it, though for the sake of brevity I will save the details for another post.

The problem as I see it is that programming language decisions were originally made based on memory limitations and on thinking too close to the machine, and we have not evolved far enough past these tarpits. We don't have enough, or any, "High Enough Level" languages. You've probably heard the cobbler's children analogy applied to programming, particularly to language design. If we were shoemakers, we would be hammering horseshoes onto our kids' feet if nails were cheaper than stitches. Swear to God.

2 comments:

Anonymous said...

It's early in the morning and I have no motivation to address your
points in any particular order.

And no I didn't play your map. I was going to but then some shit
happened. Major shit. In brief, I was involved in a helicopter
chase.

Thank you for the Dijkstra comment. I whole-heartedly agree that
starting indices at zero is asinine. Of course the typical response
is that "you don't get it," but it's not hard to get. Dijkstra's view
of indexing is not revolutionary---just antiquated. I prefer it for
low-level environments where I must be very aware of the target
machinery, but in any "high-level" situation I see no good reason to
start at zero.

So what languages work with units? The only one I can think of off
the top of my head is TeX---good thing too, I would hate to be able to
only specify my margins in inches. :(

I would have never guessed the issue with ACS, but I did immediately
notice "int duration" and how you used floating point values. As soon
as I saw that I knew what the rest of the update was about. Simply
retarded behavior. Now if you'll excuse me, I have to go work in a
language where 10 + '30 days' is 40. Wee!

eiyukabe said...

helicopter chase...?

I don't know of any languages that have unit support. I'm sure googling would bring up a ton of obscure languages, which is better than nothing, though I wish units were more prevalent in the languages we are likely to have to use at work. I read a paper on the subject and it did not seem particularly difficult to implement, so I imagine the lack of popularity is not due to implementation difficulties. I think units can be added to any language that supports arithmetic, the biggest difficulty being how to fit them into the grammar unambiguously. The tokens that represent units would probably be matched with the typical [_a-zA-Z][_a-zA-Z0-9]* identifier pattern, so there would need to be either a sentinel token or a guaranteed context so we could differentiate 'minutes' the unit from 'minutes' the (poorly named but syntactically allowable) variable.