Wednesday, January 28, 2009

Patent Fending

From wikipedia:

"At present, the (US patent) files, are so extraordinarily complex and the items so multitudinous that a veritable army of governmental servants is required to attend them and sort them into some order of distinguishable categories to which reference may be made when corresponding with patent applicants for the purposes of examiner citation of "prior art" disclosure. This complexity makes it inevitable that the human-equation involved in government servants relative to carelessness or mechanical limitations should occasion the granting of multitudes of "probably" invalid patent claims."

Funny thing? That is a quote from 1938.

WWWTP

I like games. You probably like games as well if you are reading this blog.

Let's play a game! I call it “What's Wrong With This Picture?”

Friday, January 9, 2009

Bad ACS, no Twinkie

So I've gotten back into Doom modding, and in the process I have had to pick up ACS again. After spending over an hour trying to print a single sentence to the screen (I'll get to that in a second) and having Doom Builder crash Yet Again, losing all of my progress, I figured I needed a break to blow off steam.

So why did it take me so long to print a message to the screen? Modularity. The mod I am making is going to have a lot of text printed to the screen, and, realizing this up front, I decided to go ahead and write a function that would center the text and compute the duration (how long the text should be displayed) based on string length, to save me some typing. There is a rant in here alright, but there are also some interesting programming issues that I want to touch on lightly. For example, how do we balance planning ahead and building foundations with actually getting shit done? There, that was lightly, wasn't it?

Basically, I wanted to go from typing this:

hudMessage(s:"mah message"; HUDMSG_PLAIN | HUDMSG_LOG, 0, CR_GRAY, 0.5, 0.5, duration);

to typing this:

pcThink("mah message", duration)

and have the 'duration' variable be intelligently calculated (with the option to explicitly set it in cases where the default algorithm generates suboptimal results). Thus I set out to write the pcThink function. After several iterations, I ended up with the following:

function int pcThink(str thought, int duration)
{
    // a value of 0 for the duration means the caller wants
    // us to figure it out for ourselves
    if (duration == 0.0)
    {
        duration = strLen(thought) * 65536;
        duration = duration / 10.0 + 1.0;
    }
    hudMessage(s:thought; HUDMSG_PLAIN | HUDMSG_LOG, 0,
        CR_GRAY, 0.5, 0.5, duration);
    // return our duration in tics so it can be used in
    // delay() calls
    return duration * 35 / 65536;
}


If you can spot the mistake, then maybe you deserve a twinkie. ACS does not. I'll give you a hint: the symptom of the bug was that my messages were not lasting long enough. And no, there were no compile errors.

Okay, maybe you noticed my liberal mix of floats (they're actually fixed-point values in ACS) and ints. duration is, after all, declared as an int, and I am multiplying it by 65536 in one place then dividing it by 10.0 in another. Am I crazy? Have my rugged good looks finally gotten the best of my mental faculties? Perhaps - nay, likely. But that is not the cause of my problems in this instance, though you would be looking in the right direction.

hudMessage takes a fixed-point value for its last parameter, which represents how long the message is supposed to stay on the screen. I am passing in duration here, which was declared as an int, but like any good language ACS coerces ints to floats (fixeds?) and vice versa, which all sounds keen and gives one a warm fuzzy feeling, right? Except that this "coercion" is done in a manner described most accurately as despicable. ints in ACS are 4-byte storage containers, pretty typical. Fixed-point values do not technically exist, in that there is no "fixed" keyword like there is an "int" keyword, though there are fixed-point literals, which can be illustrated by the following declaration:

int wolfInSheepsClothing = 50.0;

The way this is stored is that the upper 2 bytes represent the integer part (things to the left of the point) while the lower 2 bytes represent the fractional part (things to the right of the point). This is atrocious! 1.0 does not equal 1 in this system, and not just because it actually equals 1.0000001. No, because 1.0 actually equals 65536! In Soviet ACS, 1.0 + 1 equals 65537! I'm not even kidding! Who the hell thought this was a good idea?! The error in my above code is where I am dividing by 10.0, intending to increase the duration by one second for every ten characters in the message; but since 10.0 compiles to 655360 and the division is plain integer division, I instead increase the duration by one second for every 655,360 characters!
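To make that concrete, here is a minimal sketch of the problem and the fix. This assumes ZDoom's flavor of ACS (its Log statement, the f: print cast, and the built-in FixedDiv), and the script number is arbitrary:

script 10 OPEN
{
    // fixed-point literals are just scaled integers: 1.0 compiles to 65536
    Log(d: 1.0);        // prints 65536
    Log(d: 1.0 + 1);    // prints 65537

    // the buggy division: '/' is plain integer division and 10.0 is really 655360,
    // so a 100-character message contributes 100 * 65536 / 655360 = 10 raw units,
    // which hudMessage reads as roughly 0.00015 seconds instead of 10.0
    Log(f: (100 * 65536) / 10.0);

    // two ways to get the intended "ten seconds for a 100-character message":
    Log(f: (100 * 65536) / 10);            // divide the fixed value by a plain int
    Log(f: FixedDiv(100 * 65536, 10.0));   // or use ZDoom's fixed/fixed division
}

In pcThink itself the fix is just as small: divide by 10 instead of 10.0 (or call FixedDiv(duration, 10.0)), and the messages last as long as they should.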

Other gripes with ACS include dividing time into units of 35 tics per second (I'm sure there's a good reason...) and converting strings to ints when the plus operator is used, instead of concatenating or generating a compile error. That said, ACS is actually a pretty good language for what it's meant to do, and it provides script parallelization that is easy to reason about.
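On the tics point, at least the return value from pcThink makes the bookkeeping bearable. A rough usage sketch, assuming the division bug above is fixed; the script number and second message are just placeholders:

script 11 (void)
{
    // 0 asks pcThink to compute the duration from the string length
    int tics = pcThink("mah message", 0);

    // wait until the first message expires (35 tics to the second)
    Delay(tics);

    // or force an explicit two-second duration
    pcThink("mah second message", 2.0);
}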

I am getting sick of goofy language design decisions pandering to low-level needs slowing down my high-level development. We need better abstractions that fit with our taught domain knowledge of mathematics and text, so we can write "x = 1 + 2.0" without worrying about coercion tricks while "x = "foo" + 7" or "if (someIntegerValue)" will not compile. And don't get me started on pointer arithmetic. Also, why on earth have units not become more prevalent? I might not remember if delay(int) expects tics or seconds, but if I am allowed to type "delay(70 tics)" or "delay(2 seconds)" at my whim, I don't have to remember. And having static unit analysis at compile time could avoid certain embarrassing blunders.
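Until some language gives me that, the closest thing in ACS is faking the units with a named constant. A trivial sketch, assuming ACC's #define and an arbitrary script number; note that it gives readability, not the static unit analysis I actually want:

#define TICS_PER_SECOND 35

script 12 (void)
{
    Delay(2 * TICS_PER_SECOND);   // two seconds, without remembering the magic 35
    Delay(70);                    // still compiles, of course - no unit checking here
}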

Oh, and indexing starting at zero! Dijkstra be damned, I still find this to be a counterintuitive (and thus incorrect) language design decision after over a decade of programming with it, though for the sake of brevity I will save the details for another post.

The problem as I see it is that programming language decisions were originally made based on memory limitations and thinking too close to the machine. We have not evolved enough past these tarpits. We don't have enough, or any, "High Enough Level" languages. You've probably heard the cobbler's children analogy applied to programming, particularly to language design. If we were shoemakers, we would be hammering horseshoes onto our kids' feet if nails were cheaper than stitches. Swear to God.

Tuesday, January 6, 2009

Appreciation of Music in Games

It happened again at lunch today. I was talking about Castlevania: Symphony of the Night with a coworker, and, unprompted, one of the first things they brought up was how good they remembered the music being. A couple of other people have said the same thing, and I have even read a preview comparing the music in our game to the music in SotN (in quality, definitely not in style). Unlike me, these people played the game when it came out (XBLA gave me a second chance to redeem myself), and they still find the music memorable. Trouble is, I can't for the life of me remember a single track from beginning to end, and I have just recently beaten the game a second time to get 200.6% and once more with Richter to get the full 200 achievement points. And then some to get 99 of everything the librarian sells including duplicators (I used the sword familiar glitch, but it still took a while). And then some more to grind up some levels. Fantastic control scheme, brilliant use of gameplay mechanics to implicitly branch the story, and support for a variety of play styles make this game one of the best of the PSX era. But wait, there was music playing in the background? I do remember some area had music that sounded like it would be better on an elevator than in a Castlevania, and the ending theme was far more emotional than the rest of the game tried to evoke, but the remainder of the tracks have not managed to stick in my memory.

I have begun to wonder if I am not giving enough attention to the aural aspect of game consumption. If ten years from now someone were to bring up SotN in a conversation, the first things I would laud would be the castle design and the risky decision to put so much effort into implementing, but not advertising, the second half of the game. The background tracks, while fitting and mostly just as polished as the rest of the game, would not even come to mind. But my experience has shown me that Symphony's music is in fact one of the first things people remember about the game. If you have played this title, what do you think? Is the music memorable to you? Is it as good as the gameplay or better?

That out of the way, there are a few titles that show me I can notice and appreciate music in games. Eternal Sonata immediately comes to mind, as do many older JRPGs (almost any FF, Chrono Trigger, Lunar, Xenogears - pretty much anything done by Uematsu, Mitsuda, or Iwadare). I recently started playing Banjo-Kazooie and marvelled at how the music changes dynamically from an ominous version to a softer version of the same tune when you approach the door to a world, or when you transition underwater. I even found myself relaxing to the title screen theme of the XBLA port of Uno when I left it running in the background. Conversely, when I played the original Castlevania again recently, I was cringing at how grating the three-channel NES tracks were. We put up with a lot back in the day...

How about you? What games have soundtracks that you remember years later (or will likely remember years from now if they are recent)? I would have to give mad props to the original Nights for having some of the most original, uplifting music in video game history (the sequel does a good job in this area as well) and to Doom64 for trading pop melody for darker atmospheric tracks (best in the series, imo). Another interesting question: is it possible for game music to be "too good" - that is, to be so outstanding that you are pulled out of the immersion and start appreciating it for its own merits? If this occurs, is it a good thing for the game as a whole?