The Great Porting Adventure: Day 7

In this episode, our hero does battle with the beasts known as music and vsync… and wins.

Last time, I wrote about how I had successfully transitioned to a SoundEffect-based audio system. This was true, but it was also not the end goal. I was playing my music as a SoundEffect as well, which had the unfortunate requirement of all the source files being in .wav format. This meant that the music in my game was taking up about 170MB of space. Yikes.

There are two ways to deal with this in XNA land: switch to XACT to get fancy xWMA compression (not an option) or use the Song API to load MP3 or WMA files (both of which end up as built WMA files). Simple, right?

So I switched up my audio system to use Songs whenever a music cue was played (and updated the pipeline and content loader accordingly to use Song objects instead of SoundEffects for music cues). Then I remembered some of the drawbacks to the Song API.

You can only have one Song playing at a time, so there’s no way to do cross-fading. What’s more, the Song is controlled through the MediaPlayer class, which has ridiculous overhead for simple operations like changing Volume. That makes fading out manually by adjusting Volume impractical: it’s possible, but you take a big performance hit for something simple (on Xbox, anyway).
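
For context, standard Song usage looks like this (the asset name here is made up):

// Standard XNA Song/MediaPlayer usage: only one Song can play at a
// time, and all control goes through the static MediaPlayer class.
Song song = Content.Load<Song>("Music/theme");
MediaPlayer.IsRepeating = true;
MediaPlayer.Play(song);

// A manual fade means ticking this property every frame; on Xbox the
// setter is expensive enough to make that a real performance problem.
MediaPlayer.Volume = 0.5f;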

So I reverted all of those changes and went back to using SoundEffect for music. I realized that you can actually use waves processed with a bit of compression (ADPCM) and still load them as SoundEffect objects. This was great, because it cut the file size by something like 70% at “Medium” quality (which sounds fine for the chiptune-style music I have).

Next I had to solve the pesky problem of not being able to play multiple SoundEffectInstances of the same SoundEffect (a limitation of the current build of MonoGame). Fortunately, there is an outstanding pull request on the MonoGame GitHub project that integrates OpenAL support along with a bunch of missing SoundEffect functionality. I had to download and build maccore and monomac from source first, but that was a simple case of cloning those repos and running “make” from inside the monomac folder.

This is a picture of the OpenAL logo, because otherwise this post would be mostly text.

After all that – it didn’t work! Turns out that OpenAL doesn’t seem to support ADPCM wave files, so I’m back to square one as far as music goes.

Or am I? OpenAL can read MP3 files. And the SoundEffect object doesn’t care what’s in its buffer. So I can actually load MP3 files as SoundEffects directly using MonoGame! Vanilla XNA can’t even do that!
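A sketch of what that enables (SoundEffect.FromStream is the standard XNA-shaped entry point; the file path is made up, and exactly where this MonoGame build decodes the MP3 is my assumption):

// Load an MP3 straight into a SoundEffect. OpenAL ends up with the
// decoded buffer; the SoundEffect object itself doesn't care where
// the data in its buffer came from.
SoundEffect music;
using (var stream = System.IO.File.OpenRead("Content/Music/theme.mp3"))
{
    music = SoundEffect.FromStream(stream);
}

var instance = music.CreateInstance();
instance.IsLooped = true;
instance.Play();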

I had one more bug where I wasn’t disposing of SoundEffectInstance objects properly (my code), but once that was fixed, everything seemed to be working perfectly. Yay!
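
The cleanup itself is the usual dispose-when-stopped pattern; a minimal sketch (illustrative, not my actual code):

// Dispose instances once they've stopped so the underlying audio
// sources get released instead of leaking.
void CleanupStoppedInstances(List<SoundEffectInstance> active)
{
    for (int i = active.Count - 1; i >= 0; i--)
    {
        if (active[i].State == SoundState.Stopped)
        {
            active[i].Dispose();
            active.RemoveAt(i);
        }
    }
}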

Next on the docket was vsync. The screen tearing I was experiencing was unacceptable, so vsync was definitely required. Thankfully, @SoftSavage (the current MonoGame coordinator) pointed me in the right direction with some docs from Apple on the subject. It was a simple case of finding out where to get the OpenGLContext and then setting the SwapInterval. Since I had just built monomac from source, this was relatively easy to track down and add to Game.applyChanges() thusly (see the SwapInterval line at the end):

internal void applyChanges()
{
    if (GraphicsDevice.PresentationParameters.IsFullScreen)
        _platform.EnterFullScreen();
    else
        _platform.ExitFullScreen();

    // FIXME: Is this the correct/best way to set the viewport? There
    // are/were several snippets like this through the project.
    var viewport = new Viewport();

    viewport.X = 0;
    viewport.Y = 0;
    viewport.Width = GraphicsDevice.PresentationParameters.BackBufferWidth;
    viewport.Height = GraphicsDevice.PresentationParameters.BackBufferHeight;

    GraphicsDevice.Viewport = viewport;

    _platform.Window.OpenGLContext.SwapInterval = graphicsDeviceManager.SynchronizeWithVerticalRetrace;
}

There was a bit of other housekeeping to get the above code to work, so I include that only to show you the general idea. There is one other odd and important thing: in monomac, the MonoMacGameView class (parent of the GameWindow your game runs in) looks like this:

public void Run (double updatesPerSecond)
{
    AssertValid ();
    if (updatesPerSecond == 0.0) {
        Run ();
        return;
    }

    OnLoad (EventArgs.Empty);

    // Here we set these to false for now and let the main logic continue
    // in the future we may open up these properties to the public
    SwapInterval = false;
    DisplaylinkSupported = false;

    // Synchronize buffer swaps with vertical refresh rate
    openGLContext.SwapInterval = SwapInterval;

    if (displayLinkSupported)
        SetupDisplayLink ();

    StartAnimation (updatesPerSecond);
}

You can see that vsync will be disabled no matter what you do (the SwapInterval = false line). Fortunately, this only occurs just before the first Update. It does mean that you have to set up vsync after your Game has called Initialize(). I just added the following to MacGamePlatform.StartRunLoop():

public override void StartRunLoop()
{
    _gameWindow.Run(1 / Game.TargetElapsedTime.TotalSeconds);

    // The first call to MonoMacGameView.Run disables vsync for some reason, so give
    // the game a chance to re-enable it here
    this.Game.applyChanges();
}

And then everything was right as rain.

The game is really coming together now. I gave the app packager/installer a whirl too and that looks pretty straightforward, which is a breath of fresh air compared to making an installer on Windows. More work remains to be done on key binds, controller support, my loading screens and some resolution/alt-tab support, but I’m getting close! Oh, and I suppose I’ll need to look into Mac Store requirements sooner rather than later too 🙂

Running total: 7 days, $112.87

The Great Porting Adventure: Day 6

Day 5 ended with me realizing that an XACT-based audio system was probably going to be a bit of a bust using MonoGame. Sure, I had gotten the very basic “play a sound” functionality to work, but all the features I make use of in XACT are not likely to show up in the near future.

Speaking of which, I was asked why I use XACT at all. My main reason for trying it was its handling of music. XACT gives you nice crossfade options, as well as Category support (which in turn grants you Custom Soundtrack support on Xbox for free). It also gives you a nice layer of abstraction between sounds and the waves they use, meaning I can tweak the underlying sound of the game without changing anything in code. These sounds can also have nifty features like distance attenuation, instance limiting and automatic volume/pitch variance for giving your audio a bit of randomness. All of this is managed in a reasonably functional application that allows me to keep my audio content separate from the game. To add new sounds (cues), I just update the XACT project and then call the corresponding cue from within code. I don’t need to update my game’s content project or loading, since it’s all handled through the “.xap” asset (which builds the associated wavebanks and soundbanks).

So XACT is great and now I can’t use it. But you can’t use it on Windows Phone either, so really it’s just Windows and Xbox. Maybe moving away from XACT is a prudent move, especially if I can get the same functionality out of SoundEffect.

It was time for a musical interlude!

Disclaimer: This is basically unnecessary over-engineering that I did for the heck of it. Don’t think for a second that this is required for porting to MonoGame, though it is part of my approach to porting.

I had already added a layer of indirection to my audio system, so it now looked a little like this:

Pseudo-UML. Thanks Word art!

Coming up with an equivalent SoundEffect-based implementation was relatively straightforward, given that the features I use in XACT aren’t that advanced. I had to do a bit of work to keep track of instances (in categories) as well as music cross-fading, but it was all pretty simple to do. And my game didn’t need to change at all (other than to say whether it wanted to use XACT or SoundEffect).
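
As a rough sketch of the shape (the names here are illustrative, not my actual classes):

// The game only ever talks to this interface; an XACT-backed or a
// SoundEffect-backed implementation is plugged in behind it at startup.
public interface IAudioPlayer
{
    void PlayCue(string cueName);
    void StopCue(string cueName);
    void PlayMusic(string cueName);      // cross-fades from the current track
    void SetCategoryVolume(string category, float volume);
    void Update(float elapsedSeconds);   // advances fades, culls dead instances
}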

The big wrinkle was how I was going to tell my game what sounds to load and what sort of features they would have (Category, volume/pitch variance, cue names rather than file names). After all, I want to just be able to play a sound by its cue name from the game, regardless of what the audio implementation is underneath.

One of the overarching goals of this whole project is to avoid maintaining different versions of anything just for the sake of porting. “Write once, play everywhere” right? If I need to maintain a separate list of sound effects to load (or have to add them to my project manually), it’s going to get out of sync quickly and be a huge headache.

But I already have a description of all the waves, cues, categories and events: the .xap file.

Here’s a peek at a small part of the .xap file for DLC Quest:


Category
{
    Name = Music;
    Public = 1;
    Background Music = 1;
    Volume = -260;

    Category Entry
    {
        Name = Global;
    }

    Instance Limit
    {
        Max Instances = 1;
        Behavior = 2;

        Crossfade
        {
            Fade In = 2000;
            Fade Out = 2000;
            Crossfade Type = 1;
        }
    }
}

Aha! A plain-text description of all the meta-data I’d need to play my sounds! Not in an easily parseable format, though. It looked simple enough that I was going to try writing a quick and dirty parser. Then I found XapParse. From the CodePlex page:

A C#-based XACT project (XAP) file parser, intended for use as part of the custom content pipeline in XNA.

Hot dog, just what I needed! Here was the plan:

Add a new SoundEffectMetaData class to my audio system. Whenever a cue needs to be played by the SoundEffect-based system, it looks up the associated meta data and uses that to play the appropriate wave with the correct category/volume variance/pitch variance, etc.

Create a XAP Pipeline Extension project to:

  1. Read in the .xap file.
  2. Parse the .xap file into objects using the xapparse project.
  3. Construct a list of SoundEffectMetaData objects from the in-memory representation of the XAP project in Step 2.
  4. Write out the list of meta data to an .xnb file (automatic .xnb serialization is awesome – no work here!)
  5. Also, call context.BuildAsset to find and build the corresponding .wav file.

Step 5 is important, because it means that I can just build the .xap file and it will build all the required wave assets without me needing to add them to my content project.
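
To give a feel for the shape of that extension, here’s a rough sketch of the processor (step numbers refer to the list above). The real xapparse API and my actual classes differ from this: XapProject.Parse, the Cues collection and the SoundEffectMetaData fields are all placeholders.

using System.Collections.Generic;
using Microsoft.Xna.Framework.Content.Pipeline;
using Microsoft.Xna.Framework.Content.Pipeline.Audio;
using Microsoft.Xna.Framework.Content.Pipeline.Processors;

// Placeholder meta-data class; the real fields follow whatever XACT
// features the game uses (category, variance, instance limits, ...).
public class SoundEffectMetaData
{
    public string CueName;
    public string WaveAssetName;
    public string Category;
    public float VolumeVariance;
    public float PitchVariance;
}

[ContentProcessor(DisplayName = "XAP Metadata Processor")]
public class XapMetaDataProcessor : ContentProcessor<string, List<SoundEffectMetaData>>
{
    public override List<SoundEffectMetaData> Process(string xapText, ContentProcessorContext context)
    {
        var project = XapProject.Parse(xapText);   // step 2 (xapparse entry point assumed)
        var result = new List<SoundEffectMetaData>();

        foreach (var cue in project.Cues)          // step 3
        {
            result.Add(new SoundEffectMetaData
            {
                CueName = cue.Name,
                WaveAssetName = cue.WaveFileName,
                Category = cue.Category,
            });

            // Step 5: ask the pipeline to build the backing .wav as a
            // regular SoundEffect asset alongside the meta data.
            context.BuildAsset<AudioContent, SoundEffectContent>(
                new ExternalReference<AudioContent>(cue.WaveFileName),
                "SoundEffectProcessor");
        }

        return result;   // step 4: automatic .xnb serialization handles this
    }
}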

Lastly, in my game itself:

  1. Reference the XAP Pipeline Extension in the Content project.
  2. Switch the Importer/Processor on the .xap file to use my new extensions and build.
  3. Copy the built list of metadata and all the built sounds to the Mac project.
  4. Switch the Importer/Processor back to the default XACT project to continue using it with the Windows/Xbox builds.

Step 4 here is simply because all content builds are performed on Windows and I need to build things a different way for Mac. I still need to come up with a nicer way of syncing content to my Mac build, but for now I can just switch the processor and I get everything I need for the SoundEffect-based implementation.

So now I can continue to manage all my audio inside XACT but still end up with the metadata I need for using SoundEffect when I choose. No duplicating effort, no managing two lists of sounds and their properties. Pretty slick, if a bit convoluted. But that’s why we program, right? 🙂


This is running a bit long, so I’ll take a quick look at what remains to be done:

  • Handle music properly. Right now it’s just a normal .wav SoundEffect, so the filesize is huge. Should probably be using the Song API.
  • Fix multiple sound effect instances (currently bugged in MonoGame, but there’s a promising pull request for OpenAL support that I’ll try out)
  • Fix vsync (still no idea where to start)
  • Investigate key binds
  • Investigate controller support
  • Bring back my loading screen
  • Supported resolutions list
  • Alt-tabbing? Minimizing? Losing focus? Etc.
  • App bundling
  • Test test test

Not going to finish within 7 days, but the game is looking pretty good right now.

Running total: 6 days, $112.87

The Great Porting Adventure: Day 5

Shorter update today, as I didn’t spend as much time working on the port yesterday. Things are still coming along, but I’m not there yet.

First off, I did a bit of testing to see what was already working. It seems like everything around saving and loading is functional, making use of the user directory seamlessly. Switching to full-screen also seems to Just Work, though I bet the Alt-Tab equivalent on Mac will need some handling. The list of supported resolutions isn’t being populated, so I’ll need to address that eventually. My use of render targets to perform a screen swirl effect also works nicely. I was initially surprised, but then I remembered that I use render targets for drawing anything in my game (to handle resolution-scaling easily), so it was clearly working already.
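
That approach looks roughly like this (a generic sketch of the technique, not my actual code; the virtual resolution and names are made up):

// Created once at startup: a render target at the game's fixed
// virtual resolution.
var sceneTarget = new RenderTarget2D(GraphicsDevice, 1280, 720);

// Each frame: draw the scene into the target, then scale it to
// whatever backbuffer resolution the player picked.
GraphicsDevice.SetRenderTarget(sceneTarget);
// ... draw the whole game at the virtual resolution ...
GraphicsDevice.SetRenderTarget(null);

spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.Opaque,
                  SamplerState.PointClamp, null, null);
spriteBatch.Draw(sceneTarget, GraphicsDevice.Viewport.Bounds, Color.White);
spriteBatch.End();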

I then decided to track down the “jitter” that was occurring in my physics. The player character would constantly be moving slightly above ground and then landing, causing a sound and a puff of smoke several times a second. The likely culprit was my newly added handling in the physics manager’s step function that was chewing through updates in increments of 1/60f. It turned out that some rounding was causing updates of 1/60f followed immediately by another update with a very tiny time delta to pick up the remaining time in the frame. Evidently the frame time being passed in was just slightly larger than 1/60f. This tiny extra update wasn’t enough for the player to “fall” all the way to the ground, so his flags were updated to say he was off the ground and hence he would “land” the next frame. Ugly! But also an easy fix (and my fault – nothing to do with MonoGame).
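
For the curious, the fix boils down to not running that residual micro-step. A minimal sketch, assuming an accumulator-driven loop (the field and StepPhysics are illustrative):

float _accumulator;

void UpdatePhysics(float elapsedSeconds)
{
    const float FixedStep = 1f / 60f;
    const float Epsilon = 1e-4f;   // swallows the floating-point residue

    _accumulator += elapsedSeconds;
    while (_accumulator >= FixedStep - Epsilon)
    {
        StepPhysics(FixedStep);    // the real fixed-size physics update
        _accumulator -= FixedStep;
    }
}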

Next up was vsync. I use vsync, but it wasn’t taking effect on the port. It turns out that vsync is just not implemented in MonoGame yet. I’m not sure how that’s an option for anybody – the screen tearing is horrible without it in my game. I spent a little while searching through the code and reading up on vsync in OpenGL and OpenTK, but it’s a problem I don’t actually know how to solve yet. I’ll dig into the Update/Draw/Present loop today, I think – my game is probably pegging the CPU at 100% now anyway.

The next glaring issue I tackled is shown below:

I have around 99 problems and one of them is texture filtering.

Something was going a bit wrong with the way my textured tiles were being drawn, but only when the camera was in certain positions. I use Point (or NEAREST in OpenGL terms) texture filtering, so being “a little bit off” on texture coordinates causes the completely wrong pixel to be drawn. The pixels that are drawn are a consequence of the spritesheet I use – you’re just seeing the nearest adjacent pixel (which belongs to another tile type).

After ensuring that the correct sampling and filtering was being used, I used a debugger to verify that my vertices were being sent to the device with the correct texture coordinates. They all seemed correct, but they were also right on the boundary of the… texel (I think that’s the term I want) that I wanted to use. It seemed like occasionally the Nearest filter would pick the pixel just slightly to the left of the coordinate rather than the one just slightly to the right. It feels like I need a half-texel offset or something so that I would be sampling in the middle of the texel rather than the boundary. Regardless, it’s a bit beyond my understanding and the hacky fix was just to add a tiny constant offset to the texture coordinate, which did the trick. Sometimes the “right” fix is the one that works.
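
In code, the hack amounts to something like this (illustrative names, not my actual tile renderer):

// Bias each UV slightly into the texel so NEAREST can't round over
// to the neighbouring pixel on the spritesheet. The constant is
// arbitrary, just much smaller than one texel.
Vector2 TileUV(int pixelX, int pixelY, Texture2D sheet)
{
    const float bias = 0.001f;
    return new Vector2(
        (pixelX + bias) / sheet.Width,
        (pixelY + bias) / sheet.Height);
}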

Now that the game looks more or less correct (minus vsync) and plays properly, I decided it was time to see just how broken XACT under MonoGame really was. I had seen this video showing an AppHub sample running with XACT sound on MonoGame, so I knew there was at least some potential. The first roadblock was that xWMA files are not supported. I use xWMA for my music, as it saves me something like 95% in file size versus PCM. For giggles, I rebuilt my XACT project using all PCM format (for a nice 20x increase in file size) and loaded it up. I spent another while figuring out that other things are not supported, such as using Volume/Pitch Variance and RPCs like DistanceAttenuation. Categories are also missing. Those are pretty much my main reasons for using XACT in the first place.

What’s more, the reading of XACT soundbanks/wavebanks is essentially black magic. None of the file format is documented, so the whole thing is a lot of guesswork and tedious file analysis. It took a fair amount of work just to safely ignore the unsupported features I was using in order to have the basic sound effects work. I don’t see myself reverse engineering the binary format to get those features working.

To that end, I’ve decided to give the SoundEffect approach a shot. Taking a page out of computer science’s handbook, I added another layer of indirection to my audio system and can now swap out the XACT implementation for a SoundEffect-based implementation without my game being any wiser. But I still want to retain some of the nifty features, like volume/pitch variance. And I don’t want to have to maintain a separate list of .wavs to load. Fortunately, it looks like the .xap file for an XACT project contains a human-readable description of all the waves, sounds and cues in the project. If I can process that at build time to generate some SoundEffect wrapper objects… hmmm… a new project for today.

That’s where things stand for now. I’m going to continue working at a replacement audio implementation. I’m also going to look into vsync – if anyone has any experience here, I’d love to hear from you.

Running total: 5 days, $112.87