cave success

Yesterday I started to feel like I didn't have superpowers -- for a week now I've been working on documentation for ChemPad, not writing any code. So I devoted the afternoon to therapeutic programming on Screen: programming that makes me feel good.

I had a fantastic day in the Cave: I made text swarm around the user's head, along a path determined by a bunch of sine waves with different frequencies and amplitudes... I could last about ten seconds in VR with words swimming around me before I started to feel very dizzy. That's actually a good thing in this case, because the point is to overwhelm the user with text.

At Josh's suggestion, I tried it out with extruded polygonal text instead of texture-mapped, antialiased, alpha-blended text. Damn, it looks good. Poor Josh brought it up like, "This is just blue-sky stuff, an idea off the top of my head, I know it would probably be really hard to do, but it would be cool if the text had some depth." Aha! That's just an option in FTGL. I only had fifteen minutes or so to get it working before we had to go meet friends for Battlestar Galactica, and I couldn't find the depth control. I came back to the Cave later that night, after three or four episodes of BG:TNG, and got extruded text working very quickly. I turned on hardware-supported full-screen antialiasing, and now it looks better than the texture-mapped, alpha-blended text, and I don't have to do z-sorting.
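
The swarm path itself is nothing fancy: a base orbit around the head, with a few sine terms of different frequencies and amplitudes layered on top so the words wander instead of circling cleanly. A minimal sketch of the idea (every name here is hypothetical, not the actual Cave code):

    // Sketch: each word loops around the head on a circle perturbed
    // by summed sine waves of different frequencies and amplitudes.
    #include <cmath>

    struct Vec3 { float x, y, z; };

    // One sine component: amplitude, frequency (Hz), phase offset.
    struct SineTerm { float amp, freq, phase; };

    // Position of word i at time t, relative to the user's head.
    Vec3 swarmPosition(float t, int wordIndex,
                       const SineTerm* terms, int numTerms,
                       float baseRadius)
    {
        const float twoPi = 6.2831853f;

        // Spread the words out along the loop so they don't bunch up.
        float offset = wordIndex * 0.7f;

        // Base circular orbit around the head at eye height.
        float angle = 0.4f * t + offset;
        Vec3 p = { baseRadius * std::cos(angle),
                   0.0f,
                   baseRadius * std::sin(angle) };

        // Layer the sine terms onto each axis with slightly different
        // phases so the path drifts and weaves.
        for (int i = 0; i < numTerms; ++i) {
            const SineTerm& s = terms[i];
            p.x += s.amp * std::sin(twoPi * s.freq * t + s.phase + offset);
            p.y += s.amp * std::sin(twoPi * s.freq * t + 1.3f * s.phase + offset);
            p.z += s.amp * std::cos(twoPi * s.freq * t + s.phase + offset);
        }
        return p;  // add the tracked head position to place it in the Cave
    }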
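
The extruded text boils down to very little code, assuming FTGL's FTExtrudeFont class and a GL context created with a multisample buffer (the font path below is just a placeholder):

    // Sketch of the extruded-text setup, not the actual Cave code.
    #include <FTGL/ftgl.h>
    #include <GL/gl.h>

    void drawSwarmWord(const char* word)
    {
        static FTExtrudeFont font("/path/to/some/font.ttf");  // placeholder
        static bool initialized = false;
        if (!initialized) {
            font.FaceSize(72);   // glyph size
            font.Depth(10.0f);   // the "depth control" I couldn't find at first
            initialized = true;
        }

        // Hardware full-screen antialiasing: with a multisampled visual
        // this is all it takes (GL 1.3+), and since the glyphs are opaque
        // polygons there's no alpha blending and no z-sorting.
        glEnable(GL_MULTISAMPLE);

        font.Render(word);
    }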

Yesterday's coding demonstrated that the toolkit I spent my January weekends building really does enable rapid design and implementation of spatial text applications for virtual reality. Yeah!!!