Gaff, Gleam, and Reminiscing About Project Mercury
New post within a month!
Lately I’ve been having a lot of fun doing some cross-platform development. I’ve got two libraries I’ve been working on. Gaff is my general-purpose, stick-whatever-I-feel-like-in-it library. It encompasses a lot of stuff, such as timers, personal implementations of standard data structures (e.g. vector, list), and other useful things I make C++ wrappers for.
Gleam is the other library I’ve been developing. When I was on Project Mercury, I spent a good 4-6 months working with WebGL. While DigiPen does have a lot of graphics courses, I never actually had the privilege of building a graphics engine from scratch and integrating everything myself. It was either a barebones rush job for a project because no one was a graphics expert (and we didn’t have the time to learn), or I was using a pre-existing engine. So Project Mercury and the Wildman tools I was building were me fiddling around, seeing how everything worked, and building myself a solid understanding of the fundamentals of computer graphics. With my interest piqued, I decided to learn me some Direct3D 11 and make a graphics engine. Originally it was going to be a Direct3D 11-only renderer, but something caught me while reminiscing about the Project Mercury days and I decided to go full bore cross-platform: OpenGL 4.3 and Direct3D 11. Of course, along the way I also basically had to implement my own version of SDL from scratch, which is proving to be a lot of fun. I was extremely excited when I got a window popping up on Windows and Linux without any modifications to client-side code.
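The "same client-side code on Windows and Linux" trick usually comes down to hiding the native windowing API behind one interface. Here's a minimal sketch of how that might look; the class and method names are illustrative, not Gleam's actual API:

```cpp
#include <cassert>
#include <string>

// Client code only ever sees this interface. The platform backend
// (Win32, X11, etc.) is picked at compile time, so client-side code
// never changes between Windows and Linux.
class Window {
public:
    Window(const std::string& title, int width, int height)
        : _title(title), _width(width), _height(height) {}
    virtual ~Window() {}

    virtual bool init() = 0;        // create the native window
    virtual void pollEvents() = 0;  // pump the platform message queue

    int width() const { return _width; }
    int height() const { return _height; }
    const std::string& title() const { return _title; }

protected:
    std::string _title;
    int _width;
    int _height;
};

// A do-nothing backend, standing in for the real platform classes so
// this sketch is self-contained and testable.
class NullWindow : public Window {
public:
    using Window::Window;
    bool init() override { return true; }
    void pollEvents() override {}
};

// Real backends would live behind something like:
// #if defined(_WIN32)
//     class WindowWin32 : public Window { /* CreateWindowEx, PeekMessage */ };
// #else
//     class WindowX11 : public Window { /* XCreateWindow, XNextEvent */ };
// #endif
```

The payoff is that the game/tool code holds a `Window*` and never touches `#ifdef`s itself.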
Anyhow, that’s what I’ve been working on lately. Gleam is nowhere near ready for actual use yet (I don’t even have anything drawing on the screen; I’m still getting the Window class to behave exactly the same under Windows and Linux), but I’ve gone a while without a post and needed some content to talk about. Until next time!
AFTER POST STORY TIME:
Talking about Project Mercury brought some memories back. The Wildman tools were not the only thing I built in WebGL. In fact, I had an entire demo application I was using to build my WebGL graphics engine. I had lighting, shadow maps, frustum culling, object picking via querying a bounding volume hierarchy, and animated models! I was able to import models made in Milkshape (it was the only open file format I could find at the time) and they were completely animated. I even had a feature where you could pause the animation, turn on joint editing mode, and modify the skeleton of the rig in real time. There was also a system where you could capture key frames in-engine and play them back as a new animation, all without using external tools or anything. This probably doesn’t sound very impressive, but as my first serious attempt at computer graphics, I was very pleased with the results I achieved. Not to mention I also had a ton of fun!
So, by the time I was moved over to Wildman and started working on the tools, I had already done a ton of work on my WebGL graphics engine (which I built as a library that could be used in other Project Mercury applications). I think I had been working on the Wildman mod tools for about a month and had everything you see in that video up and running, minus the 3D preview window. Then, the week before we shot all the footage for the Project Mercury video, I was asked to make a 3D renderer for the mod tools. Normally I would give myself more than a week to add something like that, so I was understandably concerned at first. However, I met the time frame with some time to spare! It took me two days to get a Wildman model imported, rendering, and properly integrated with the rest of the Wildman tool suite. After that, I think it was just another two days to construct the visual representation from the level data I was generating and add some lighting.
While I was implementing the lighting for the Wildman tools, I realized that my way of handling lighting was kind of expensive and not very flexible. That’s basically another way of saying I had implemented a forward renderer. So I started working on a deferred renderer, which I never finished due to Wildman getting canceled. While I had most of a deferred renderer up and running, it was impractical for WebGL. Why? Because in WebGL you can only bind one color texture per framebuffer! This means that, in order to gather all the information a deferred renderer needs, I’d have to render the scene once for each piece of information. So, if I want the base color, position, and light contributions, I’d have to render everything in my viewport three times! Not the most efficient way of doing that. Direct3D and desktop OpenGL let you bind more than one texture to a framebuffer, allowing you to render the scene once but output different information into each texture, all in one shot. I think multiple color attachments are part of OpenGL ES 3.0 (WebGL is basically a JavaScript binding based on the OpenGL ES 2.0 specification), and the next version of WebGL is targeting that spec. Or at least I think it is. I can’t remember where I might have read that.
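For the curious, here's roughly what that multiple-render-target setup looks like in desktop OpenGL. This is a sketch, not code from Gleam; it assumes an active GL 3.0+ context, and the resolution and attachment count are just illustrative:

```cpp
// One framebuffer, three color attachments: draw the scene once and the
// fragment shader writes albedo, position, and lighting data in a single
// pass. WebGL 1 only allows GL_COLOR_ATTACHMENT0, which is why a
// deferred G-buffer there means one full scene pass per texture.
GLuint gbuffer, targets[3];
glGenFramebuffers(1, &gbuffer);
glBindFramebuffer(GL_FRAMEBUFFER, gbuffer);

glGenTextures(3, targets);
for (int i = 0; i < 3; ++i) {
    glBindTexture(GL_TEXTURE_2D, targets[i]);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, 1280, 720, 0,
                 GL_RGBA, GL_FLOAT, nullptr);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0 + i,
                           GL_TEXTURE_2D, targets[i], 0);
}

// Map the fragment shader's outputs to all three attachments at once.
const GLenum bufs[3] = { GL_COLOR_ATTACHMENT0,
                         GL_COLOR_ATTACHMENT1,
                         GL_COLOR_ATTACHMENT2 };
glDrawBuffers(3, bufs);
// ...render the scene once; each shader output lands in its own texture.
```

The `glDrawBuffers` call is the piece WebGL 1 was missing; everything else about the framebuffer setup looks much the same.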
Anyhow, my post-post stroll down memory lane is done! I hope you liked my mini-post mortem of Project Mercury/Wildman! Project Mercury was a really cool project that I was proud to be a part of. I’m a bit sad to see it go. :'(