The Great Porting Adventure: Day 5
January 11, 2012
Shorter update today, as I didn’t spend as much time working on the port yesterday. Things are still coming along, but I’m not there yet.
First off, I did a bit of testing to see what was already working. It seems like everything around saving and loading is functional, making use of the user directory seamlessly. Switching to full-screen also seems to Just Work, though I bet the Alt-Tab equivalent on Mac will need some handling. The list of supported resolutions isn’t being populated, so I’ll need to address that eventually. My use of render targets to perform a screen swirl effect also works nicely. I was initially surprised, but then I remembered that I use render targets for drawing anything in my game (to handle resolution-scaling easily), so it was clearly working already.
I then decided to track down the “jitter” that was occurring in my physics. The player character would constantly be moving slightly above ground and then landing, causing a sound and a puff of smoke several times a second. The likely culprit was my newly added handling in the physics manager’s step function that was chewing through updates in increments of 1/60f. It turned out that some rounding was causing updates of 1/60f followed immediately by another update with a very tiny time delta to pick up the remaining time in the frame. Evidently the frame time being passed in was just slightly larger than 1/60f. This tiny extra update wasn’t enough for the player to “fall” all the way to the ground, so his flags were updated to say he was off the ground and hence he would “land” the next frame. Ugly! But also an easy fix (and my fault – nothing to do with MonoGame).
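One standard shape for that fix is a fixed-step accumulator that treats sub-epsilon remainders as rounding noise and carries them into the next frame instead of running a near-zero step. A minimal sketch, in Python rather than the game's C#, with made-up names (`STEP`, `EPSILON`, `step_physics` are all mine):

```python
# Illustrative fixed-timestep accumulator. The bug: a frame time slightly
# over 1/60 left a tiny remainder, which ran as an extra near-zero physics
# step. The fix: treat remainders within EPSILON of zero as rounding noise
# and carry them forward rather than stepping on them.

STEP = 1.0 / 60.0
EPSILON = 1e-4  # anything this close to a step boundary is rounding noise

def step_physics(frame_time, accumulator, on_step):
    """Run fixed-size physics steps; carry tiny remainders to the next frame."""
    accumulator += frame_time
    while accumulator >= STEP - EPSILON:
        on_step(STEP)
        accumulator -= STEP
    # whatever is left (less than one step) stays in the accumulator
    return accumulator
```

The key difference from what I had before: a frame time of `1/60 + tiny` now produces exactly one physics step, with `tiny` rolled into the next frame.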
Next up was VSync. I use vsync, but it wasn’t taking effect on the port. It turns out that vsync is just not implemented in MonoGame yet. I’m not sure how that’s an option for anybody – the screen tearing is horrible without it in my game. I spent a little while searching through the code and reading up on vsync in OpenGL and OpenTK, but it’s a problem I don’t actually know how to solve yet. I’ll dig into the Update/Draw/Present loop today I think – my game is probably pegging the CPU at 100% now anyway.
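In the meantime, a common stopgap for the CPU-pegging half of the problem (though not the tearing) is to sleep away whatever is left of each frame's budget. A rough sketch of the idea in Python, purely illustrative; in the real game this would live somewhere in the Update/Draw/Present loop:

```python
import time

TARGET = 1.0 / 60.0  # frame budget for 60 fps

def limit_frame(frame_start):
    """Sleep off the unused portion of the frame budget.
    Returns the start time for the next frame."""
    elapsed = time.perf_counter() - frame_start
    if elapsed < TARGET:
        time.sleep(TARGET - elapsed)
    return time.perf_counter()
```

This caps the frame rate and frees the CPU, but unlike real vsync it does nothing to synchronize buffer swaps with the display, so tearing remains.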
The next glaring issue I tackled is shown below:
Something was going a bit wrong with the way my textured tiles were being drawn, but only when the camera was in certain positions. I use Point (or NEAREST in OpenGL terms) texture filtering, so being “a little bit off” on texture coordinates causes the completely wrong pixel to be drawn. The pixels that are drawn are a consequence of the spritesheet I use – you’re just seeing the nearest adjacent pixel (which belongs to another tile type).
After ensuring that the correct sampling and filtering were being used, I used a debugger to verify that my vertices were being sent to the device with the correct texture coordinates. They all seemed correct, but they were also right on the boundary of the… texel (I think that’s the term I want) that I wanted to use. It seemed like occasionally the Nearest filter would pick the pixel just slightly to the left of the coordinate rather than the one just slightly to the right. It feels like I need a half-texel offset or something so that I would be sampling in the middle of the texel rather than the boundary. Regardless, it’s a bit beyond my understanding and the hacky fix was just to add a tiny constant offset to the texture coordinate, which did the trick. Sometimes the “right” fix is the one that works.
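For the curious, the half-texel idea amounts to insetting each tile's UV rectangle so you sample inside the texel rather than exactly on the boundary, where NEAREST can round either way. A hypothetical sketch in Python (the game is C#; `tile_uvs` and its parameters are mine, not code from the game):

```python
# Compute the UV rectangle for one tile in a spritesheet, inset by a
# fraction of a texel on each edge. With NEAREST filtering, coordinates
# exactly on a texel boundary can snap to the neighboring tile's pixels;
# insetting keeps every sample safely inside the intended tile.

def tile_uvs(tile_x, tile_y, tile_size, sheet_w, sheet_h, inset=0.5):
    """UV rectangle (u0, v0, u1, v1) for a tile, inset by `inset` texels."""
    u0 = (tile_x * tile_size + inset) / sheet_w
    v0 = (tile_y * tile_size + inset) / sheet_h
    u1 = ((tile_x + 1) * tile_size - inset) / sheet_w
    v1 = ((tile_y + 1) * tile_size - inset) / sheet_h
    return (u0, v0, u1, v1)
```

My actual fix was cruder (a tiny constant offset rather than a proper half-texel inset), but the principle is the same: never hand a NEAREST sampler a coordinate that sits exactly on a texel edge.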
Now that the game looks more or less correct (minus vsync) and plays properly, I decided it was time to see just how broken XACT under MonoGame really was. I had seen this video showing an AppHub sample running with XACT sound on MonoGame, so I knew there was at least some potential. The first roadblock was that xWMA files are not supported. I use this for my music, as it saves me something like 95% filesize versus PCM. For giggles, I rebuilt my XACT project using all PCM format (for a nice 20x increase in filesize) and loaded it up. I spent another while figuring out that other things are not supported, such as using Volume/Pitch Variance and RPCs like DistanceAttenuation. Categories are also missing. Those are pretty much my main reasons for using XACT in the first place.
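Volume/pitch variance, at least, looks easy to recreate on top of plain playback: XNA's `SoundEffect.Play` overload takes a volume in 0..1 and a pitch in -1..1, so a per-play randomization within the authored range should do the trick. A hypothetical sketch in Python (the game is C#; `play_with_variance` and its signature are mine):

```python
import random

def play_with_variance(play_fn, base_volume, base_pitch,
                       volume_var=0.0, pitch_var=0.0, rng=random):
    """Call play_fn with volume/pitch randomized within +/- the variance,
    mimicking XACT's Volume/Pitch Variance on top of plain playback."""
    volume = base_volume + rng.uniform(-volume_var, volume_var)
    pitch = base_pitch + rng.uniform(-pitch_var, pitch_var)
    # clamp to the ranges SoundEffect.Play expects: volume 0..1, pitch -1..1
    volume = max(0.0, min(1.0, volume))
    pitch = max(-1.0, min(1.0, pitch))
    play_fn(volume, pitch)
```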
What’s more, the reading of XACT soundbanks/wavebanks is essentially black magic. None of the file format is documented, so the whole thing is a lot of guesswork and tedious file analysis. It took a fair amount of work just to safely ignore the unsupported features I was using so that the basic sound effects would play. I don’t see myself reverse engineering the binary format to get those features working.
To that end, I’ve decided to give the SoundEffect approach a shot. Taking a page out of computer science’s handbook, I added another layer of indirection to my audio system and can now swap out the XACT implementation for a SoundEffect-based implementation without my game being any wiser. But I still want to retain some of the nifty features, like volume/pitch variance. And I don’t want to have to maintain a separate list of .wavs to load. Fortunately, it looks like the .xap file for an XACT project contains a human-readable description of all the waves, sounds and cues in the project. If I can process that at build time to generate some SoundEffect wrapper objects… hmmm… a new project for today.
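To give a flavor of what that build step might look like, here's a toy scraper in Python. Heavy caveat: I haven't pinned down the actual .xap grammar yet — the nested-block shape and the `Name`/`File` field names below are assumptions based on the file being human-readable text, not a verified spec:

```python
import re

# Assumed (unverified) .xap-like structure for illustration only.
SAMPLE_XAP = """
Wave
{
    Name = Jump;
    File = Sounds\\jump.wav;
}
Wave
{
    Name = Land;
    File = Sounds\\land.wav;
}
"""

def extract_waves(xap_text):
    """Return (name, file) pairs for each Wave block in the project text."""
    waves = []
    for block in re.findall(r"Wave\s*\{(.*?)\}", xap_text, re.DOTALL):
        name = re.search(r"Name\s*=\s*([^;]+);", block)
        path = re.search(r"File\s*=\s*([^;]+);", block)
        if name and path:
            waves.append((name.group(1).strip(), path.group(1).strip()))
    return waves
```

A real version would also need to pull variance settings and cue-to-sound mappings out of the file, but even name/path pairs would already spare me maintaining a separate .wav list by hand.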
That’s where things stand for now. I’m going to continue working on a replacement audio implementation. I’m also going to look into vsync – if anyone has any experience here, I’d love to hear from you.
Running total: 5 days, $112.87