The Great Porting Adventure: Day 4

Here be dragons

Back to work after a weekend off. I’ve started to get some good feedback from other devs who are curious about the porting process. It’s great to see some on Twitter deciding to give it a whirl too – it’s always easier when you have more eyes on something. It seems like most are content to follow along and see what happens first. Honestly, that’s a pretty smart approach since I have no idea if this is going to work in the end.

So, day four. This was easily the worst day so far.

At the end of day three, I had my game all converted and compiling, but anything beyond my first screen wasn’t drawing properly. In fact, it was initially crashing trying to apply a texture that was never set (and never needed to be applied anyway). I put a quick guard against using such a null texture to get things running, but it quickly became obvious that things were Not Right.

Specifically, nothing except my menus and the background was being drawn at all. It looked a bit like this:

You could technically play the game using the brown pixels under the "Y" in the top left.

Upon (very) close inspection, I could see some tiny moving dots in the upper left corner of the screen. It seemed like anything I was drawing with just a vanilla “SpriteBatch.Draw()” call was working, but the rest of my game was tiny and offset. All of those draw calls used a more involved Draw override that included a SpriteSortMode as well as a BasicEffect argument. Now I had to figure out why those calls weren’t doing what I expected.
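For context, a rough sketch of the two drawing paths (texture and effect names are placeholders; in the XNA 4.0 API, the sort mode and custom effect are supplied through SpriteBatch.Begin rather than Draw itself):

```csharp
// Path 1: vanilla draw -- this was working fine.
spriteBatch.Begin();
spriteBatch.Draw(someTexture, position, Color.White);
spriteBatch.End();

// Path 2: explicit sort mode plus a custom BasicEffect --
// these draws were the ones coming out tiny and offset.
spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.AlphaBlend,
                  null, null, null, basicEffect);
spriteBatch.Draw(someTexture, position, Color.White);
spriteBatch.End();
```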

This is where I spent most of my day. Stepping through the MonoGame framework, trying to figure out what it was doing and then trying to figure out what it should be doing instead. This isn’t easy when you don’t know what “right” looks like.

Armed with Reflector and the StockEffects sample from AppHub, I dug through the internals of XNA’s SpriteBatch, BasicEffect and its associated shaders. MonoGame is not the line-for-line port of XNA that I expected. I’m guessing that’s for legal reasons, but maybe it’s for technical reasons. I don’t know. I do know that the implementation MonoGame uses is incomplete though. This makes sense, since only 2D is supported at the moment whilst BasicEffect has a lot of 3D features.

Eventually, I found this in MonoGame’s implementation of one of BasicEffect’s vertex shaders:

gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;

This was the first “Aha!” moment. Setting World/View/Projection on the BasicEffect modifies shader parameters, not built-in GLSL uniforms (by the way, did I mention I had to read up on OpenGL and GLSL?). This is why setting my own values had no effect! I therefore changed it to this:

gl_Position = gl_Vertex * WorldViewProj;

In the above, “WorldViewProj” is a uniform that is set by BasicEffect. Note that the order of multiplication is switched because matrices in XNA are row-major, whereas OpenGL uses the column-major convention. I think.
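The row-vector convention is easy to verify from the C# side. A tiny sketch (values chosen arbitrarily) showing that XNA stores the translation in the fourth row of the matrix, consistent with v' = v * M:

```csharp
// XNA uses row-major matrices with row-vector math (v' = v * M),
// so the translation component lives in the fourth ROW (M41..M43).
Matrix m = Matrix.CreateTranslation(10f, 20f, 30f);
// m.M41 == 10, m.M42 == 20, m.M43 == 30

// Vector3.Transform applies v * M under that convention:
Vector3 moved = Vector3.Transform(Vector3.Zero, m); // (10, 20, 30)
```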

Compile, run…

No change.

Since I don’t really know what I’m doing, I played around with various different matrix values and shaders for a while with no luck. I would receive runtime errors whenever the shaders failed to compile or link, so I foolishly assumed they were working. I began to suspect something was seriously wrong when I modified the pixel shader to just return the color red and yet the textures were still drawn. Something was definitely not behaving as I expected. Whenever problems like this come up in the AppHub forums, the answer is invariably, “Have you tried PIX?”. I should look into some sort of PIX alternative – it would have saved me a lot of time.
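The "all red" test is a one-line change to the fragment shader; something along these lines (GLSL 1.x style, matching the built-ins above):

```glsl
void main()
{
    // Ignore textures and tinting entirely: if this shader is actually
    // running, everything it touches should come out solid red.
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}
```

When the screen stays textured after a change like this, the shader you edited is not the one being used for the draw.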

It took a lot of investigation, stepping through code, and reading up on OpenGL to determine that although the shader program was being constructed, compiled, linked, and set properly, it was also subsequently being unset just before the actual OpenGL draw call happened. Argh! To figure that out, I used the following call to inspect which shader program was active:

GL.GetInteger(GetPName.CurrentProgram, out programNum);

Since “programNum” came back zero, I knew OpenGL was using the fixed-function pipeline and not BasicEffect’s shaders. It was easy enough to track down where the erroneous call to GL.UseProgram(0) was and resolve that.
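A sketch of that check as a debugging helper, using OpenTK's GL bindings (placement right before the draw call is the key part):

```csharp
// Debugging sketch: ask OpenGL which shader program is bound
// immediately before issuing the draw call.
int programNum;
GL.GetInteger(GetPName.CurrentProgram, out programNum);

if (programNum == 0)
{
    // Zero means no program is bound, so OpenGL falls back to the
    // fixed-function pipeline and silently ignores custom shaders.
    Console.WriteLine("Warning: no shader program active at draw time!");
}
```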

With the shaders now actually being used, I was able to work my way back from “all red pixel shader” to a properly functioning BasicEffect, complete with World/View/Projection matrices. Drumroll…

DLC Quest on a Mac! Now with actual gameplay!

Yay! In-game!

That’s actually skipping ahead a bit, but I figured you deserved a picture after all that rendering rambling.

I had to do a bit more work to enable SpriteBatch tinting, but that was rendering solved for today. There were a few more issues though.

First off, a lot of my data (awardments, DLC pack definitions, etc) was not being loaded at all. It turned out to be my use of the ContentManifest, which is a sample from the AppHub. In a nutshell, the ContentManifest is an asset that builds a list of all the assets in your Content project at build time. The result is a list of filenames that you can read at runtime to iterate through the built files. I use this to facilitate a data-driven approach to things like awardments – I just drop an awardment .xml file in my content project and it will be loaded by the AwardmentManager automagically.

However, since all content is built on Windows, those filenames use the Windows file separator (“\”). When I read them at runtime on a Mac, they still have the (now incorrect) Windows separator and thus parsing them using something like Path.GetDirectoryName doesn’t work. I added a little fixup routine for the slashes and that was that.
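The fixup itself is tiny. A minimal sketch (the helper name is mine, not from the actual codebase):

```csharp
// Manifest paths are baked with Windows separators at build time;
// normalize them to the current platform's separator before parsing.
static string FixupManifestPath(string manifestPath)
{
    return manifestPath.Replace('\\', Path.DirectorySeparatorChar);
}

// On Mac: "Content\\Awardments\\foo.xml" -> "Content/Awardments/foo.xml"
// On Windows the replacement is a no-op, so the same code runs everywhere.
```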

The last problem was that everything was falling through the world.

This came as a bit of a surprise. How could that be platform dependent? Maybe a floating point precision difference? I was perplexed, and certainly not looking forward to debugging my core collision library.

I disabled gravity and glided my character around the world, only to find that collision worked just fine against vertical surfaces. I then added a toggle to gravity so I could turn it back on at runtime easily, which let me run and jump properly, but only if gravity was initially disabled. That was a hint that something was going wrong on the first frame.

Turns out that my physics step function took a “float deltaTime”, but made no checks that the delta was a reasonable duration. The first frame of my game was unusually long on Mac and thus all of the objects were simply tunneling through the ground on the first update. Whoopsie. I updated the function to chew through multiples of 1/60 sec rather than just whatever the deltaTime was and then things were back to normal.
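The fixed-step idea can be sketched like this (an accumulator pattern; the clamp value and PhysicsStep name are assumptions for illustration, not the actual code):

```csharp
// Consume elapsed time in fixed 1/60 s slices so a single long frame
// (like the first frame on Mac) can't tunnel objects through the ground.
const float Step = 1f / 60f;
const float MaxDelta = 0.25f;   // assumed clamp for runaway frames
float accumulator = 0f;

void Update(float deltaTime)
{
    accumulator += Math.Min(deltaTime, MaxDelta);
    while (accumulator >= Step)
    {
        PhysicsStep(Step);      // always integrate a safe, fixed slice
        accumulator -= Step;
    }
}
```

Any leftover fraction of a step simply carries over into the next frame's accumulator.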

This has been a super long update, but it was a long and frustrating day for the most part. Being able to run around my game and interact with things is a great reward though, so I’m thoroughly motivated to press on. There’s still a lot to do though. There seems to be some precision problem with collision after all, as entities are a bit jittery on the ground. Audio is still missing. I haven’t even tried the part of my game that uses render targets. Controller support is untested. Keybinding is an unknown. The list goes on.

And yet, I’m happy with my progress 🙂

Stepping outside your programmer comfort zone is how you become a better programmer. I’ve definitely got that first bit down.

Running total: 4 days, $112.87