Saturday, July 29, 2017

Atmosphere Rendering, part 2

Still doing work on my Atmosphere Shader; however, it's proven to be quite the resource hog!

The current implementation is a fairly typical screen-space shader.  It ray-traces the light from the sun through the atmosphere until it hits the current pixel fragment, calculating the sun transmittance along this ray.  Then it marches along the view ray towards the camera, accumulating in- and out-scattering and blending over the background until it arrives at the final fragment colour.
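The transmittance part of that per-fragment work can be sketched in Python like so (the scale height, planet radius and extinction coefficient here are illustrative placeholders, not the engine's actual values):

```python
import math

# Illustrative constants, not the engine's actual values
SCALE_HEIGHT = 8000.0        # metres; density falls off exponentially with altitude
PLANET_RADIUS = 6_371_000.0

def density(p):
    """Atmospheric density at point p (3-vector), exponential falloff with altitude."""
    altitude = math.sqrt(p[0]**2 + p[1]**2 + p[2]**2) - PLANET_RADIUS
    return math.exp(-max(altitude, 0.0) / SCALE_HEIGHT)

def transmittance(a, b, steps=16):
    """Fraction of light surviving the trip from a to b (Beer-Lambert law),
    integrated by marching in fixed steps and sampling density at midpoints."""
    seg = [(b[i] - a[i]) / steps for i in range(3)]
    seg_len = math.sqrt(sum(s * s for s in seg))
    optical_depth = 0.0
    for i in range(steps):
        mid = [a[j] + seg[j] * (i + 0.5) for j in range(3)]
        optical_depth += density(mid) * seg_len
    return math.exp(-optical_depth * 1e-5)   # 1e-5: illustrative extinction coefficient
```

The cost problem is that the full shader runs a loop like this *inside* every step of the view-ray march, which is exactly the nested loop the precomputation described below is meant to remove.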

Incorporating Mie and Rayleigh scattering in the ray-tracing gives the following results:

Adding the clouds into the mix (as a prototype) gives results that I think have outstanding potential.  However, the added number of calculations pretty much takes the shader to breaking point, so much so that I can't even prototype what it will finally look like, because I have to reduce the number of ray-marching steps in the shader.   Put simply, the shader is brute-force ray-tracing with no real optimisations and has reached its limit.

So my next step will be to split the atmosphere rendering into two steps.  Firstly, I plan to precompute a 3D Volume Texture containing the results of the light transmittance from the sun to each visible point.   This 3D Volume Texture only needs to cover the sunward-facing side of the planet, and I plan to parameterise the volume as a set of concentric hemispherical shells.

Once the precomputed volume texture is available, the screen-space atmosphere shader only has to do the ray-trace from the fragment surface to the eye, and for each point along this ray, it can simply look up the sun-fragment transmittance, rather than calculate it on the fly.
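A sketch of the kind of shell parameterisation I have in mind, mapping a world-space point to (u, v, w) coordinates in the volume texture.  For clarity this assumes the sun lies along +z; the names and ranges are illustrative:

```python
import math

def volume_coords(p, r_inner, r_outer):
    """Map a world-space point p (planet-centred coords, sun along +z)
    to (u, v, w) lookup coordinates in a hemispherical-shell volume.
    w selects the concentric shell; (u, v) locate the point on that shell."""
    r = math.sqrt(p[0]**2 + p[1]**2 + p[2]**2)
    w = (r - r_inner) / (r_outer - r_inner)                  # which shell, in [0, 1]
    u = math.acos(max(-1.0, min(1.0, p[2] / r))) / math.pi   # polar angle from the sun axis
    v = (math.atan2(p[1], p[0]) / math.pi + 1.0) * 0.5       # azimuth around the sun axis
    return (u, v, max(0.0, min(1.0, w)))
```

With coordinates like these, the screen-space pass replaces its inner sun-ray march with a single trilinear texture fetch per sample.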

This will vastly speed up the shader by removing its costly inner-loop.  And in doing so, it will vastly increase the quality of results, because I will now be able to step at a higher frequency, and incorporate better algorithms for the scattering phase functions.

I'm still planning more optimisations, and think that a 3D Air Density volume texture can be precomputed also.  This will aid both calculating the sunlight transmittance volume, and the realtime raymarching algorithm.

I can't wait to see what the atmospheres look like after the optimisations :)   It's worth mentioning that this method of shading the atmosphere allows for fully dynamic clouds, which I intend to run as either a particle system, or cellular automata on the GPU, or a combination of both.  With fully dynamic lighting and seamless volume rendering, it's going to be a real asset to the game.

Tuesday, June 14, 2016

Screenshot Progress Update!

I've made a lot of progress recently focusing on refining my procedural planets and atmospheres.  Added into the mix are multi-core particle systems, procedural nebula and AI.  I'm almost ready to post videos but am excited that I have now fully tied in the galaxy to the rest of the engine, meaning totally seamless travel between every star in the galaxy all the way from warp-drive-speeds down to landing on and running around planets in first-person.

Enough talk, here are some screenshots of my recent explorations:

Some kind of desert planet, looks a little bit like Australia!

Here's a shot of my prototype cloud system.  The clouds are image based, the idea being to use a particle system or maybe cellular automata on the GPU to create procedural cloud maps for each planet.  The clouds are incorporated into the atmosphere scattering shader, meaning that they get nice sunset lighting and they also cast shadows into the atmosphere.  Plus you can place objects in, and fly through, the clouds seamlessly.

Flying down to and then crash-landing on a planet somewhere

The atmosphere shader now shades the sun very nicely at sunrise, giving it that deep red glow!

Some random exploration screenshots

Showing off the procedural moons as well as planets

Taking a cruise early in the morning

This solar system has a dark sun, making everything gloomy and spooky

Looking up at the sun from a crater on a moon

Taking off from some kind of pink-crystals planet

This planet has a huge crater lake on it

A valley of forest in a grove of crystals

Just exploring some more planets

Now that I have AI spaceships and have implemented combat, the game is starting to feel like a real game more than a space sandbox!

Friday, November 27, 2015

Procedural Planets

I've started work on the procedural planets; here are some initial screenshots.  Like everything in Pegwars, it all has to be seamless, from flying in outer space to orbiting planet and moon surfaces to landing and running around.

Each planet's geometry is formed by tessellating an icosahedron.  This shape was chosen to help make each vertex as equidistant as possible in the final mesh; tessellated icosahedra perform better in this regard than tessellated cubes or parametric spheres, and have the added bonus of having triangular faces, making it easy to tessellate and render.

Each triangle face of the icosahedron is recursively split at the edge midpoints to generate the next level of detail of geometry, as needed.  For performance reasons, as you don't want to be allocating GPU resources at runtime, Pegwars has a fixed set of vertex and index buffers for planets and moons - meaning a maximum of 10 planets at once, each with up to 3 orbiting moons.  Once I improve the level of detail algorithms, these restrictions will be able to be lifted.
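The subdivision arithmetic is easy to pin down: each midpoint split turns one triangle into four, and Euler's formula for a closed triangle mesh fixes the vertex count.  A quick sketch:

```python
def icosphere_counts(levels):
    """Faces and vertices of an icosahedron after `levels` midpoint subdivisions.
    Each subdivision replaces every triangle with four smaller ones."""
    faces = 20 * 4 ** levels
    # Closed triangle mesh: E = 3F/2, so Euler's V - E + F = 2 gives V = F/2 + 2
    verts = faces // 2 + 2
    return faces, verts
```

At the nine levels of detail mentioned below, this works out to about 5.2 million triangles and 2.6 million vertices per fully-subdivided planet, which is why generation has to happen off the main thread.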

The surface parameterisation for planet metadata and texture mapping is the cube map.  Each planet gets a cubic height/slope map, texture map, object map and normal map generated for it.  Later on I will add cubic sub-surface mineral maps for resource extraction as well.
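For reference, mapping a surface direction to a cube-map face and (u, v) follows the standard dominant-axis convention.  A Python sketch (the face ordering matches the usual +x/-x/+y/-y/+z/-z layout, but is otherwise illustrative):

```python
def cube_face_uv(d):
    """Map a direction d to (face, u, v) for cube-map sampling.
    Faces: 0 +x, 1 -x, 2 +y, 3 -y, 4 +z, 5 -z; u, v in [0, 1].
    The largest-magnitude component picks the face; the other two
    components, divided by it, give the in-face coordinates."""
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        face, sc, tc, ma = (0 if x > 0 else 1), (-z if x > 0 else z), -y, ax
    elif ay >= az:
        face, sc, tc, ma = (2 if y > 0 else 3), x, (z if y > 0 else -z), ay
    else:
        face, sc, tc, ma = (4 if z > 0 else 5), (x if z > 0 else -x), -y, az
    return face, (sc / ma + 1.0) / 2.0, (tc / ma + 1.0) / 2.0
```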

The reason for generating cube-maps for objects (and later minerals) is that the player will be able to customise any planets they have control over, to terraform, build structures and mine resources.  These will be updated in the underlying cubemaps and persisted to the game save storage.

Currently each planet has 9 levels of detail for the basic tessellation.  This provides decent enough resolution down to planet flight, but not enough to generate a convincing on-surface tessellation.

Because the topology of the planet meshes never changes, one level-of-detail index buffer is generated at startup and reused for every planet and moon.

Procedurally generating the heights for millions of vertices, as well as calculating diffuse maps, normal maps and object maps, takes a lot of CPU time; it is definitely not a real-time task.  So, in order to achieve this during gameplay, a brand new multi-core job system was created.

The job system allows arbitrary tasks to be run in the background in parallel across multiple threads; in addition there are two dedicated task queues that are serviced on the main (rendering) thread - these are for final DX resource allocation tasks, as well as general main-thread tasks that are amortised over time to avoid frame-rate impact.
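As a rough sketch of that design (in Python for brevity, the real engine is C++; the queue names and the per-frame budget are illustrative):

```python
import queue
import threading
import time

class JobSystem:
    """Background worker threads plus a main-thread queue that is
    drained each frame under a time budget, to amortise work over frames."""

    def __init__(self, workers=4):
        self.background = queue.Queue()
        self.main_thread = queue.Queue()   # e.g. GPU resource creation
        for _ in range(workers):
            threading.Thread(target=self._worker, daemon=True).start()

    def _worker(self):
        while True:
            job = self.background.get()
            job()
            self.background.task_done()

    def submit(self, job):
        """Run a job on any background worker thread."""
        self.background.put(job)

    def submit_main(self, job):
        """Defer a job to the main (rendering) thread."""
        self.main_thread.put(job)

    def pump_main_thread(self, budget_s=0.002):
        """Call once per frame on the render thread; stops when the budget is spent."""
        deadline = time.monotonic() + budget_s
        while time.monotonic() < deadline:
            try:
                self.main_thread.get_nowait()()
            except queue.Empty:
                break
```

Heavy generation (heightmaps, normal maps) goes to `submit`, while anything that must touch the graphics API goes to `submit_main` and is paid for a slice at a time.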

When a player travels near enough to a solar system, the background jobs start generating the solar system's planets and moons.  By the time the player travels anywhere near a planet or moon, its levels of detail will have been created, along with all associated cube map metadata.  This allows for seamless traversal from outer-solar-system flight all the way down to the planet surface.

In addition to the planet itself, I've also started work on procedural ground objects.  These objects are generated based on the underlying diffuse map (as the diffuse map for the planet is really just a hack to describe the make up of its actual surface detail), with other inputs such as the slope and relative slope.  I'll also add to this an equatorial value (colder at the poles) and once ocean and river systems are added, proximity to water.


A seamless universe is important to the Pegwars experience, and that means procedural atmospheres you can see from outer-space all the way down to the ground.

The Pegwars atmosphere shader is drawn in screen-space, and composited over the screen using the depth information from both near and far depth textures.  The shader itself simulates Rayleigh and Mie Scattering by ray tracing through and integrating against the atmosphere volume, taking into account the air density at any point based on altitude and what's in the depth buffers, the relevant ratios of gases and particulates, as well as other procedural variables.
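The phase functions commonly used for the two scattering types can be sketched as follows.  Note the Henyey-Greenstein function here is a standard stand-in for full Mie theory, and the asymmetry value g = 0.76 is just a typical illustrative choice, not necessarily what the shader uses:

```python
import math

def rayleigh_phase(cos_theta):
    """Rayleigh phase function: near-symmetric scattering off small air molecules."""
    return 3.0 / (16.0 * math.pi) * (1.0 + cos_theta * cos_theta)

def mie_phase_hg(cos_theta, g=0.76):
    """Henyey-Greenstein approximation of Mie scattering off larger particulates;
    g in (0, 1) biases scattering forward, giving the bright halo around the sun."""
    denom = (1.0 + g * g - 2.0 * g * cos_theta) ** 1.5
    return (1.0 - g * g) / (4.0 * math.pi * denom)
```

The strong forward lobe of the Mie term is what produces the glow around the sun's disc, while the gentle Rayleigh curve gives the blue sky and red sunsets.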

As well as drawing to the screen, the atmosphere is also drawn into the near cube-map for reflections, ambient lighting and the all-important SSDO.  This way objects fit in the scene nicely, no matter what planet you are flying around.

Sunday, September 1, 2013

Pulse Lasers

Finally I have implemented the Pulse Laser onboard system.  This was an exciting build, as it leverages a number of features I've been wanting to play with.

Ship System
Each Pulse Laser attached to the ship is a ship system, and has a power capacitor which draws energy from the main power unit.  When you fire, it fires continuously until the capacitor runs out of juice.  After that, the rate of fire for the lasers is limited to the capacitor's recharge rate.  What this means is if you let the lasers fully charge, you get an initial ultra-burst of pulse laser fire, followed by a slower volley that can be sustained indefinitely.
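The capacitor logic can be sketched like this (the numbers are illustrative, not the actual ship stats):

```python
class LaserCapacitor:
    """Capacitor-limited pulse laser: a full charge yields an initial burst,
    after which the fire rate is bounded by the recharge rate."""

    def __init__(self, capacity=100.0, recharge_rate=20.0, cost_per_shot=10.0):
        self.capacity = capacity
        self.recharge_rate = recharge_rate   # energy units per second
        self.cost_per_shot = cost_per_shot
        self.charge = capacity               # start fully charged

    def update(self, dt):
        """Per-frame recharge from the main power unit, clamped to capacity."""
        self.charge = min(self.capacity, self.charge + self.recharge_rate * dt)

    def try_fire(self):
        """Fire one pulse if there's enough charge; returns whether it fired."""
        if self.charge >= self.cost_per_shot:
            self.charge -= self.cost_per_shot
            return True
        return False
```

With these example numbers a full capacitor delivers a ten-shot burst, then settles to a sustained two shots per second.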


It's important to batch objects.  A GPU can draw thousands of lasers in the blink of an eye, as long as you submit them in 1 or 2 draw calls.  If you draw each laser individually, you'll easily use up your 33 msec frame budget, or more.

In Pegwars, a Python pulse-laser fire event writes a pulse laser into a single dynamic vertex buffer.  The initial position and velocity are written into the vertices, and all of the animation is done in shaders afterwards.  This buffer is circular, and as such, depending on the head/tail indices, it requires at most two draw calls to render thousands of laser beams.
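The head/tail bookkeeping that decides between one and two draw calls can be sketched as:

```python
def draw_ranges(head, tail, capacity):
    """Draw calls needed to render the live region of a circular vertex buffer.
    `tail` is the index of the oldest live element; `head` is one past the newest.
    Returns a list of (start, count) ranges - at most two."""
    if head == tail:
        return []                                     # empty (a count would disambiguate 'full')
    if tail < head:
        return [(tail, head - tail)]                  # contiguous: one draw call
    return [(tail, capacity - tail), (0, head)]       # wrapped: two draw calls
```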

Lasers are bound to both the additive render channel and the bloom render channel to give a nice glowing-hot look.

Also, every laser beam generates a screen-space light that looks great when shooting over the top of space stations or planets.

Here's a collage of the pulse laser lighting work-in-progress, from the very first blob of light, to attenuation work, debug shaders and finally the results - firing lasers through a set of buildings on a space-station :)

Oculus Rift Integration

There's still a lot of work to do to get the Rift running smoothly; specifically, I can't seem to get the thing to focus properly in fullscreen mode (but it works great in windowed mode!).

I found the SDK alright, but a little basic.  The examples and the API itself use a right-handed (RH) coordinate system, which is not what DirectX uses, and I wish they supported both out of the box instead of me having to map between them.  "Oh, here's a quaternion providing the headset rotation, but it needs its coordinate system remapped"... uh-huh.  Oh yeah, some documentation on using the Rift with DirectX would be nice!!  Or am I the only one using DirectX these days?

In order to render for the rift, you need to draw your scene twice, with the camera offset by the distance between your eyes.  It's nice and easy, as the geometry of the rift lenses is such that you don't need to skew or rotate your projection matrices, it's just a translation.  However, drawing the scene twice obviously incurs a performance hit if you're not careful.  Thankfully the Stingray Engine I built for Pegwars builds up command buffers for drawing, meaning you can re-render just the scene's draw commands from two perspectives - without duplicating any setup work.  Nice!  My only problem right now is I'm creating light-maps twice, which obviously can be reduced down to just once per frame.  But overall, I was very happy with the rift rendering performance... certainly much faster than 50% of the single-view frame-rate!
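The eye offset really is just a translation along the camera's right axis.  A sketch (the 64 mm inter-pupillary distance is a typical default, not a Rift-specific value):

```python
def stereo_views(view_pos, right_axis, ipd=0.064):
    """Camera positions for the left and right eyes: a pure translation of the
    mono camera along its right axis by half the inter-pupillary distance each way.
    No skew or rotation of the projection matrices is needed."""
    half = ipd / 2.0
    left  = [view_pos[i] - right_axis[i] * half for i in range(3)]
    right = [view_pos[i] + right_axis[i] * half for i in range(3)]
    return left, right
```

The replayed command buffer is then executed once per eye with the corresponding view translation, so scene setup (culling, light-map generation, state sorting) happens only once per frame.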

I must say, it's pretty special to sit in a virtual 3D spaceship cockpit!

Sunday, July 7, 2013

Amusing Graphical Stuffups

One of the best things about working on a game is graphics bugs.  No, not the kind that make you spend a week delving beneath the covers of hideously complicated interactions between the CPU, API, GPU, RAM, etc.

No, the serendipitous kind!

Engage warp drive!  Funnily enough I was thinking about making some kind of FTL effect one day

Playing around with SSAO, I got some kind of cubist painting look

Ooops, wrong viewport!

Hmmm.. this is not the material you are looking for

Wrong light map!

Not quite the right amount of specular highlight