Sunday, September 1, 2013

Pulse Lasers

Finally I have implemented the Pulse Laser onboard system.  This was an exciting build, as it leverages a number of features I've been wanting to play with.

Ship System
Each Pulse Laser attached to the ship is a ship system, and has a power capacitor which draws energy from the main power unit.  When you fire, it fires continuously until the capacitor runs out of juice.  After that, the rate of fire for the lasers is limited to the capacitor's recharge rate.  What this means is if you let the lasers fully charge, you get an initial ultra-burst of pulse laser fire, followed by a slower volley that can be sustained indefinitely.
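The capacitor behaviour described above can be sketched as a tiny model. This is illustrative only; the class name, numbers and units are made up, not Pegwars' actual values:

```python
class LaserCapacitor:
    """Toy model of a pulse laser's power capacitor (hypothetical values)."""

    def __init__(self, capacity=100.0, recharge_rate=20.0, cost_per_shot=10.0):
        self.capacity = capacity            # maximum stored energy
        self.recharge_rate = recharge_rate  # energy/sec drawn from the main power unit
        self.cost_per_shot = cost_per_shot  # energy drained by one pulse
        self.charge = capacity              # start fully charged

    def update(self, dt):
        # Draw energy from the main power unit, clamped to capacity.
        self.charge = min(self.capacity, self.charge + self.recharge_rate * dt)

    def try_fire(self):
        # Fire only while there's enough charge; once empty, the rate of
        # fire is limited by recharge_rate, giving the sustained slower volley.
        if self.charge >= self.cost_per_shot:
            self.charge -= self.cost_per_shot
            return True
        return False
```

With these numbers a full capacitor gives an instant burst of 10 shots, after which fire is throttled to recharge_rate / cost_per_shot = 2 shots per second.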


It's important to batch objects.  A GPU can draw thousands of lasers in the blink of an eye, as long as you submit them in one or two draw calls.  If you drew each laser individually, you'd easily burn through your 33 msec frame budget, or more.

In Pegwars, a Python pulse-laser fire event writes a pulse laser into a single dynamic vertex buffer.  The initial position and velocity are written into the vertices, and all of the animation is done in shaders afterwards.  This buffer is circular, so depending on the head/tail indices it requires at most two draw calls to render thousands of laser beams.
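The "at most two draw calls" property falls straight out of the head/tail arithmetic on the circular buffer. A minimal sketch, assuming `tail` indexes the oldest live laser and `head` is one past the newest:

```python
def laser_draw_ranges(head, tail, buffer_size):
    """Return the (start, count) ranges needed to draw every live laser
    in a circular vertex buffer. Illustrative sketch: `tail` is the
    oldest live laser, `head` is one past the newest."""
    if head == tail:
        return []                               # buffer empty: nothing to draw
    if tail < head:
        return [(tail, head - tail)]            # contiguous: one draw call
    # wrapped around the end: two draw calls, tail..end then 0..head
    return [(tail, buffer_size - tail), (0, head)]
```

Each returned range maps directly to one draw call over that slice of the vertex buffer.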

Lasers are bound to both the additive render channel and the bloom render channel to give a nice glowing-hot look.

Every laser beam also generates a screen-space light that looks great when shooting over the top of space stations or planets.

Here's a collage of the pulse laser lighting work-in-progress, from the very first blob of light, to attenuation work, debug shaders and finally the results - firing lasers through a set of buildings on a space-station :)

Oculus Rift Integration

There's still a lot of work to do to get the Rift running smoothly; specifically, I can't seem to get the thing to focus properly in fullscreen mode (though it works great in windowed!).

I found the SDK alright, but a little basic.  The examples and the API itself use a RH (right-handed) coordinate system, which is not what DirectX uses, and I wish they supported both out of the box instead of making me map between them. "Oh, here's a quaternion providing the headset rotation, but it needs its coordinate system remapped"... uh-huh.  Oh yeah, some documentation on using the Rift with DirectX would be nice!!  Or am I the only one using DirectX these days?
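For the record, one common way to remap a right-handed rotation quaternion into a left-handed DirectX convention is to mirror it across the xy-plane (i.e. flip the z axis). This is a sketch of that one convention, not the SDK's code; the exact remap depends on which axis your engine flips:

```python
def rh_to_lh_quaternion(x, y, z, w):
    """Remap a right-handed rotation quaternion into a left-handed
    convention by mirroring across the xy-plane (z flipped).
    Conjugating the rotation by diag(1, 1, -1) negates the x and y
    components of the quaternion and leaves z and w alone.
    Illustrative only: other engines flip other axes."""
    return (-x, -y, z, w)
```

Applying the remap twice gets you back where you started, which is a handy sanity check when debugging headset orientation.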

In order to render for the rift, you need to draw your scene twice, with the camera offset by the distance between your eyes.  It's nice and easy, as the geometry of the rift lenses is such that you don't need to skew or rotate your projection matrices, it's just a translation.  However, drawing the scene twice obviously incurs a performance hit if you're not careful.  Thankfully the Stingray Engine I built for Pegwars builds up command buffers for drawing, meaning you can re-render just the scene's draw commands from two perspectives - without duplicating any setup work.  Nice!  My only problem right now is I'm creating light-maps twice, which obviously can be reduced down to just once per frame.  But overall, I was very happy with the rift rendering performance... certainly much faster than 50% of the single-view frame-rate!
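The replay-per-eye idea can be sketched with stand-ins for the engine objects. Here `command_buffer`, `apply_view` and the 0.064 m IPD default are all hypothetical, not Stingray's real API:

```python
def render_stereo(command_buffer, apply_view, ipd=0.064):
    """Replay one recorded command buffer once per eye, changing only
    the view translation along the camera's local x axis. The scene's
    draw commands are built once and re-issued twice; the projection
    matrix itself is unchanged (the Rift only needs a translation).
    `command_buffer` and `apply_view` are hypothetical stand-ins."""
    for eye, offset in (("left", -ipd * 0.5), ("right", +ipd * 0.5)):
        apply_view(eye, offset)   # translate the camera half the IPD
        for cmd in command_buffer:
            cmd()                 # re-issue the recorded draw commands
```

Because only the view transform changes between eyes, all the per-frame setup (culling, light-map generation and so on) should only ever need to happen once.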

I must say, it's pretty special to sit in a virtual 3D spaceship cockpit!

Sunday, July 7, 2013

Amusing Graphical Stuffups

One of the best things about working on a game is graphics bugs.  No, not the kind that make you spend a week delving beneath the covers of hideously complicated interactions between the CPU, API, GPU, RAM, etc.

No, the serendipitous kind!

Engage warp drive!  Funnily enough, I was thinking about making some kind of FTL effect one day.
Playing around with SSAO, I got some kind of cubist painting look.

Ooops, wrong viewport!

Hmmm... this is not the material you are looking for

Wrong light map!

Not quite the right amount of specular highlight

Wednesday, July 3, 2013

Your Oculus Order is About to Ship

Hell yeah... the Oculus Rift is coming my way.  Lucky I've been working hard on getting Pegwars ready!

- User Interface now working, displays on consoles in virtual cockpit.  From day 1 I've tried to stay away from on-screen UI, and I think that'll be vindicated when using the Oculus.  Immersion is key!
- A 'star trails' object, which is not really star trails but a detail object that helps give the impression of movement and speed in what is essentially a very large, empty space.  50,000 points wrapping, extruding and drawing in a single batch on the GPU.
- The stars you see in this shot are now drawn by the galaxy class, which is an octree of solar systems.  And although it's hard to show in a screenshot, you can fly to every single system seamlessly using a combination of your turbo drive and turning off the engine stabilisers...
- I need to start creating procedural planet textures!  A planet named 'Masjou' using the Earth texture just won't cut it.
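The galaxy class's octree of solar systems can be sketched as a simple point octree. This is an illustrative toy, not the Stingray code; capacity and coordinates are made up:

```python
class Octree:
    """Toy point octree for spatially bucketing solar systems."""

    def __init__(self, center, half, capacity=4):
        self.center, self.half = center, half   # cube centre and half-extent
        self.capacity = capacity                # points held before subdividing
        self.points = []
        self.children = None

    def insert(self, p):
        if self.children is not None:
            self._child_for(p).insert(p)        # already split: recurse
        elif len(self.points) < self.capacity:
            self.points.append(p)               # room in this leaf
        else:
            self._subdivide()                   # split and redistribute
            for q in self.points + [p]:
                self._child_for(q).insert(q)
            self.points = []

    def _subdivide(self):
        h = self.half * 0.5
        cx, cy, cz = self.center
        self.children = [
            Octree((cx + dx * h, cy + dy * h, cz + dz * h), h, self.capacity)
            for dx in (-1, 1) for dy in (-1, 1) for dz in (-1, 1)
        ]

    def _child_for(self, p):
        cx, cy, cz = self.center
        i = (p[0] >= cx) * 4 + (p[1] >= cy) * 2 + (p[2] >= cz)
        return self.children[i]

    def count(self):
        n = len(self.points)
        if self.children:
            n += sum(c.count() for c in self.children)
        return n
```

The payoff is that queries like "which systems are near the ship" only ever visit the handful of cells around the camera, instead of every system in the galaxy.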

Now one more screenshot that is Oculus Rift focussed...looking down at your own knees in the cockpit!

- In-cockpit head-look is what Pegwars has always been about, and that feature is really going to shine with the Oculus Rift.  It's going to be the perfect match.
- This shot shows the new UI again, this time the Control Centre system - with badly arranged, overlapping system icons.  Every icon is a ship system; we've got (from centre-right, anticlockwise) targeting, radar, 2 pulse lasers, 4 thrusters, nav computer, fighter bay and engine stabiliser.  Currently the thrusters and stabiliser are turned off because I've docked at the space-station :)
- Badly focused shadow-maps.  However... they are at least working.  Like all of this stuff, they need polish!

Tuesday, February 12, 2013


Planets are now becoming well-formed.  They have a placeholder atmosphere object, but will get a dedicated scattering shader.

When the ship enters a planet's atmosphere, this is detected and the Planet Flight App Module is instigated.  This sets up the rendering pipeline slightly differently, and flags are set on the spaceship physics model for atmospheric flight.

In future this app module will also begin the streaming of planet cities, detail objects, etc.
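The entry detection itself can be as simple as a radius test that drives a mode switch. A minimal sketch; the function names, mode strings and radii are hypothetical stand-ins for the real app-module machinery:

```python
import math

def in_atmosphere(ship_pos, planet_pos, atmosphere_radius):
    """True once the ship is within the planet's atmosphere radius."""
    return math.dist(ship_pos, planet_pos) < atmosphere_radius

def update_flight_mode(current_mode, ship_pos, planet_pos, atmosphere_radius):
    """Switch between hypothetical 'space' and 'planet' flight modes.
    On entry this is where the Planet Flight App Module would be
    instigated and the atmospheric-flight physics flags set."""
    inside = in_atmosphere(ship_pos, planet_pos, atmosphere_radius)
    if inside and current_mode == "space":
        return "planet"   # instigate Planet Flight App Module
    if not inside and current_mode == "planet":
        return "space"    # back to the space-flight pipeline
    return current_mode
```

Checking the mode transition rather than the raw radius each frame means the pipeline reconfiguration only happens once per entry or exit.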

Friday, February 1, 2013

An Update

I have to get back into blogging this thing, so this update is just a refresher to remind me how to log in to Blogger etc.  Here's a screenshot of the cityscapes these days:

What we've got here is:

  • A tile-based city that is streamed from a bitmap.  The tall building you see in the centre is an example of specific placement of buildings - it's not just a procedurally generated cityscape.  This is so that players will be able to grow cities on planets, but also do specific terraforming and building, and truly customise their planets.
  • A nice example of arbitrary backgrounds used as image-based lights for the scene.  While the background is currently a static mesh and bitmap, it's implemented in a way that's totally dynamic.  This way when I get around to doing the atmospheric scattering simulation (for planets) or procedurally generated nebulae in space, the models will all sit in their surroundings no matter where on the planet you are and what time of day it is.  The environment is baked into a cubemap, which is used for specular reflections (see the sides of the buildings), SSDO (the colours in the shadows) and far-z blending (aka fogging)
  • Shadow-mapped sunlight
  • Shadow-mapped internal cockpit lighting
  • Depth blur
  • Player's spaceship with internal lights and fully dynamic control centre UI.  The ring of icons represent all the ships systems - there's 4 thrusters, 2 pulse lasers, a stabiliser and missile rack.
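The bitmap-driven tiles with specific placement (first bullet above) can be sketched like this. Everything here is a toy: the tile table, the `overrides` dict and the values are invented for illustration:

```python
# Each bitmap pixel value picks a building tile; an overrides map allows
# specific placement of individual buildings (like the tall one in the
# centre of the screenshot). All names and values are illustrative.

TILE_TYPES = {0: "empty", 1: "road", 2: "low_rise", 3: "high_rise"}

def build_city(bitmap, overrides=None):
    """bitmap: 2D list of tile-type ints, as read from the city image.
    overrides: {(row, col): tile_name} for specifically placed buildings."""
    overrides = overrides or {}
    city = []
    for r, row in enumerate(bitmap):
        city.append([overrides.get((r, c), TILE_TYPES[v])
                     for c, v in enumerate(row)])
    return city
```

Keeping the procedural layer (the bitmap) and the hand-placed layer (the overrides) separate is what would let players terraform and build on top of a generated city without losing their changes.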
I've implemented the transition from Space Flight to Planetary Flight, which means the plumbing is in place for two awesome new features - the planetary atmosphere scattering simulation, and placing the cityscape on the surface of planets, such that you can fly down from space seamlessly into an environment just like the above screenshot.