Wednesday, March 31, 2010

Lighting the cockpit, pt. 2

Changed exporters to read the object name, and initialise the file/save dialog with the correct name. This should make it much quicker to export objects, and also makes explicit and easy the rule that you must save the mesh file under the same name as the object. i.e. as long as you name your object, export is as quick as pressing run exporter and OK.

Added 4 lights into the cockpit. This runs slow! It's now doing 5 passes over the entire scene with shadow-mapped lights.

Optimise the lighting - get lights to cull their drawCommands against both the camera and the light. Skip doing any culling at the Python Scene level; do it immediately in C++.

Solution - LightingRenderChannel has lightCuller_ and cameraCuller_; these run the light culler then the camera culler.
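As a rough sketch of that two-stage cull (names like lightCuller_/cameraCuller_ are from the engine, but the culler here is a stand-in: a bounding sphere instead of a real frustum test):

```python
# Illustrative sketch of the two-stage cull in LightingRenderChannel:
# a draw command is kept only if it passes the light's culler first,
# then the camera's. A sphere stands in for the real frustum test.

class SphereCuller:
    """Stand-in frustum: culls against a bounding sphere."""
    def __init__(self, centre, radius):
        self.centre, self.radius = centre, radius

    def culled(self, point):
        dx, dy, dz = (p - c for p, c in zip(point, self.centre))
        return dx * dx + dy * dy + dz * dz > self.radius * self.radius

def visible_commands(draw_commands, light_culler, camera_culler):
    # Run the light cull first, then the camera cull; a command must
    # survive both to be drawn in this lighting pass.
    return [dc for dc in draw_commands
            if not light_culler.culled(dc["centre"])
            and not camera_culler.culled(dc["centre"])]
```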

Bug - you can see black squares around the light frustum. Fix by turning off the AmbientLightingChannel.
Looking into why meshes bound to the AmbientLightingChannel are causing the lighting anomaly.

ambient_cube_map.vsh : Outputs the projected position and the world normal to o0 and o1

ambient_cube_map.psh : Samples from the dynamically generated ambient cube map

ambient_cube_map.mat : [cubeRenderTarget] ambientCubeMap

Problem - AmbientLightingChannel is now culling to the light frustum. Outside the frustum, all that's left is the SSDO colours.
Hmmm - See following code in LightingRenderChannel:

if ( rtName == "AMBIENT" )
{
    multiplicative_ = true;
    rtName_ = "lightMap1";
}

This explains the lighting bug; ambient lighting is hacked into the standard LightingRenderChannel, and so is culling against the first light.
- Removing lightCuller_.culled(bb) fixes the bug, however the performance improvement is then no longer in place
- Fixed instead with a hack in LightingRenderChannel: don't cull by light when this is the 'ambient' channel.. erk

Proper fix :
- python property on LightingRenderChannel - whether to cull to light's volume or not
- or, don't use LightingRenderChannel for ambient lighting - it's more like a post-process

Now need to cull the light projection matrices to their far attenuation radius. Then will need to actually do the attenuation too.
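A minimal sketch of both steps (the falloff curve here is a generic linear near/far attenuation, not necessarily what the engine will use; function names are illustrative):

```python
# Sketch: clamp a light's projection far plane to its far-attenuation
# radius so the shadow frustum (and hence culling) stops where the
# light's contribution reaches zero, plus a simple linear falloff.

def light_far_plane(atten_far, min_far=0.1):
    # The projection should reach no further than the attenuation radius.
    return max(atten_far, min_far)

def attenuation(distance, atten_near, atten_far):
    # 1.0 inside the near radius, 0.0 beyond the far radius,
    # linear in between.
    if distance <= atten_near:
        return 1.0
    if distance >= atten_far:
        return 0.0
    return (atten_far - distance) / (atten_far - atten_near)
```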

Quick thoughts :
The near/far plane should be passed into most shaders as shader constants from C++
Idea : have shader files/xml sections include 'state blocks'. These files describe, for example, sets of shader constants such as those explicitly hard-coded in c++ right now. Thus instead of 'set lit world mesh shader constants', you wrap those up in a '.stateblock' file, treat them as stateblocks just like .material files, and render channels can optimally sort by these.
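To make the idea concrete, here's a hypothetical sketch (the file format, StateBlock class and set_shader_constant call are all assumptions, not existing engine API): a state block is a named, reusable set of shader constants, and channels sort draw commands by block so each set is applied once per run.

```python
# Hypothetical '.stateblock' concept: a named set of shader constants
# loaded from data instead of hard-coded in C++. Render channels sort
# draw commands by state block so constants are set once per group.

class StateBlock:
    def __init__(self, name, constants):
        self.name = name                  # e.g. "litWorldMesh"
        self.constants = dict(constants)  # register -> value

    def apply(self, device):
        # 'device' and set_shader_constant are stand-ins for the
        # real rendering device interface.
        for register, value in self.constants.items():
            device.set_shader_constant(register, value)

def sort_by_state_block(draw_commands):
    # Group commands sharing a state block together, so apply()
    # only needs to run on block changes.
    return sorted(draw_commands, key=lambda dc: dc["stateblock"].name)
```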

Final screenshot shows an intermediate step from last night with just the lighting channels turned on. There's the main front white light, and the red light from the cockpit spilling out onto the land. This was before I added the attenuation in from the lights in Max, and this is why it was running so slow! Looks pretty cool though, and it's nice having an arbitrary number of full shadow-mapped lights to play with :

It's really useful, with the way the render channels are set up, that individual channels can be turned off. Debugging lighting is much easier when you can toggle all the other stages of the pipeline off at runtime.

Sunday, March 28, 2010

Lighting the cockpit

Fixed light map projection by setting the wrap mode in TextureStage to border. This may affect other maps; I've yet to see any side-effects.

Cockpit is now properly attached to the spaceship, and has its own lights working. If you link a 'CockpitPos' object when making a spaceship, the cockpit will be attached in that spot.

Cockpit camera now uses cockpit-local transform from 'Camera/Transform/Position'

Screenshot shows one outside light and two in-cockpit lights

Tuesday, March 16, 2010

That's all for now!

Ok so now I'm up to date, I've blogged all the diary entries I had in a text file on the computer. The posts won't come as frequently anymore because they'll happen when they happen, and before they happen I'm going on a 2 month road trip and moving up to Darwin :)

After that I hope to settle into a new place, a new lifestyle and new PEGWARS development.

Retrospective 15/3/2010

Made a new city block, this one a little park with 8 trees.

Added option to include Logic=****** in the user properties, this is initially
so I can export City Pieces with WrapWorldPosition already there.

Used the new .object exporter, works well.

Retrospective 14/3/2010

Got many boost::python things working now that I am using shared_ptrs. Always knew I should have but was too lazy to do so, it finally became necessary. The main thing is I can now access worldObject hierarchies *safely* and pass IData back and forth from python.

From this I now get WorldObjects loading, creating Python LogicObjects from the object files, and now have an onLoaded call. All Logic Objects get onLoaded called once the entire WorldObject is loaded.

Lights are now loaded from .object files, and they are controlled via LightLogic scripts (which currently have to be loaded via the ScriptLogic wrapper class).

The LightLogic currently sets the light matrix based on the position of its worldObject. I should make a C++ light object that is intrinsically tied to world objects.

Also created is the ControlCentre script. This will be responsible for running all the ships' systems. First up I'll just be using it to shoot pulse lasers.

Retrospective 5/3/2010

Loads done since the last posts. Just a quick comment though.

I was getting 'no to-python (by-value) converter for WorldObject::Ptr' runtime errors, simply from trying to return a child smart pointer. I thought boost would have taken care of this since I carefully made sure all my WorldObjects are now reference counted via boost::shared_ptr.

However, to fix the runtime error, I had to declare that my WorldObject::Ptr was indeed similar to a WorldObject by changing the class declaration from:

class_< WorldObject, boost::noncopyable >("WorldObject", no_init)

to:

class_< WorldObject, WorldObject::Ptr, boost::noncopyable >("WorldObject", no_init)

Funnily enough I didn't have to do this with IData::Ptr, weird

Retrospective 22/10/2009

Now to tweak I need realtime control over shader constants, python access would be good enough.

So I'm going to try and hook in with the ShaderManager and register values by shader name and register. Done.

Retrospective 20/10/2009

Debugging SSDO, although only at the SSAO stage.

Just changed the depth_normals shader to output z and w from the vertex shader and divide z by w in the pixel shader to get the desired depth. This should be the correct depth, and it removes the /50000.0 hack in the previous vertex shader.
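A quick numeric check of why z/w works (in Python rather than shader code, and assuming a standard D3D-style perspective projection - the actual engine matrix may differ): z/w lands in [0, 1] between the near and far planes, with no magic scale factor.

```python
# Illustrative check: per-pixel z/w from a D3D-style projection gives
# depth 0 at the near plane and 1 at the far plane, replacing the
# old /50000.0 scale in the vertex shader.

def projected_zw(view_z, near, far):
    # z and w rows of a standard D3D-style perspective projection.
    q = far / (far - near)
    clip_z = q * view_z - q * near
    clip_w = view_z
    return clip_z, clip_w

def depth(view_z, near, far):
    z, w = projected_zw(view_z, near, far)
    return z / w
```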

The SSDO effect is very sensitive to its world radius, you can either see the small details or the large details.

But doing importance sampling on the filter taps allows for a much vaster range for the effect to be visible.

Gonna randomise the taps over time.

The effect really needs 64 taps to look good. So I'm going to try to downsample the depth image and do SSDO on that.

Now trying to change the visibility function. Even the testVisibility function gave decent results.

Retrospective 22/9/2009

Now trying to get lighting working using the new Scene construct and support for n lights. Split LightScene into a write and read phase, each with their own RenderChannelStrip

The goal of this is to have the draw order for all render channels interleaved to look like this:
light1 write
light2 write
camera1 draw depth
light1 read
light2 read
camera1 draw materials

camera2 draw depth
light1 read
light2 read
camera2 draw materials
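The ordering above can be sketched as a little builder function for n lights and m cameras (strip names here are just illustrative labels, not engine identifiers):

```python
# Sketch of the interleaved frame order: all light writes up front,
# then per camera - depth pass, every light's read pass, materials.

def build_frame_order(lights, cameras):
    order = [f"{light} write" for light in lights]
    for camera in cameras:
        order.append(f"{camera} draw depth")
        order += [f"{light} read" for light in lights]
        order.append(f"{camera} draw materials")
    return order
```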

This is now done. Now there are heaps of Render Channel Strips, each describing one stage of the pipeline.

Multiple lights are not completely working, as they have no current way of utilising a separate render target (all are hard-coded to use data/lightMap1.xml)

If lightX write, lightX read can be customised to use any render target, the lighting can be performed.

OK, got this all pretty much working for n lights. The light maps at least seem to write correctly.

The throwing of the light on the objects doesn't yet work... and I think it's because the LightingRenderChannel sets no states whatsoever to do with the light's view matrix - so it can't be right.

Lights (1 and 2) and (3 and 4) seem to be reading from each other's depth maps. But if you set only 1 and 2, you get a consistent shadow map, separate from 3 and 4. Why would they influence each other?

Retrospective 26/8/2009

Things are going so fast it's hard to keep track. Here are some current thoughts:
  • All is working as planned, benefits of thoroughly testing the python scene library before using it in PyPegwars
  • There are a bunch of Python classes mirroring C++ classes, which is confusing right now, e.g. MeshPiece, RenderChannelStrip
  • The 'planets' aka light probes are being rendered via worldObject bind to CameraChannelStrip.
  • The lightmaps are being rendered via worldObject bind to LightChannelStrip.
  • The lighting on the objects is happening via the old system, i.e. C++ world objects binding to render channel strips.

How do I have 3 camera scenes, i.e. Far, Near and Cockpit, all rendered from the same viewpoint? How do I share a camera around?
Currently there are 5 scenes:
  1. farScene
  2. scene
  3. cockpitScene
  4. camera's CameraScene
  5. light's LightScene

TODO tonight
  • Get rid of the python RenderChannelStrip concept entirely
  • Change WorldObject to store (material,drawCommand) map instead of (renderChannelName,drawCommand) map

  • Added newmaterial / newmesh format, and late binding/creation of world draw commands
  • Added c++ MeshPiece

Retrospective 25/8/2009

Goal : get planets and suns rendering. This requires displaying the render channel strip that the world objects are currently being bound to.

Changed the "DrawScene" update command to actually draw a scene, instead of a render channel strip. This is because the scene has extra stuff like a camera/lens to setup rendering with - it may also have render targets etc. Basically we need to go via the full RenderMethod::standard fn because it does important stuff like clearing the back buffer and calling beginScene()/endScene(). In the end we may not want to use RenderMethod::standard

Split out RenderMethod::standard to ::beginScene and ::endScene

Problem now is the 'debug render channel' uses / clears the depthTexture. Alternatives here:
  • 'debug render channel' only has 'Gui' and 'Console' channels
  • makes old modules (dog fight, architect) not work
  • make them work with the new scene architecture ALL DONE
TODO : get rid of the 'debug render channel' entirely. Instead have this created by, and registered with the appropriate instances.

Got all old modules and new working with the RenderChannelStrip concept

Tried to get lights working, as that will greatly expand the number of objects i can load, i.e. planets.

  • no lighting on objects. If you get "MeshPiece" and add a "Lighting" object to it, then the mesh piece will get bound to the light channel
  • currently only the light objects have a "lighting" object. This is actually how it is designed - for deferred lighting

New Scene Library
  • has CameraScene, LightScene, owned by cameras and lights. These have CameraChannelStrips and LightChannelStrips; those classes set up the appropriate render channels when an object is added
  • supports n cameras and n lights

Old Scene Library
  • has a strip and explicitly sets up render channels, combining lights+materials+debug all as one giant strip. Supports 1 camera and 4 lights only.
  • has useful addWorldObject / addMesh fns that the new scene library lacks. However, with the new scene library, try to keep all the data loaded in from scene files and avoid explicitly creating objects wherever possible.

Retrospective 24/8/2009

Created all the old Pegwars Application Modules
ObjectPool - added error msg to report objects that have no vizCallback
Working on SpaceFlight module.
  • should this module create a solar system, or have one passed in? doesn't matter yet, defer this decision.
  • got it promoting planets and suns to the appropriate far/normal/cockpit scenes
  • made first link between python and c++ game : planets + suns now *have a* world object that represents themselves.
  • planet and sun created by solar system and added to SpaceFlight's scene.
  • SpaceFlight has far/near scenes, ticks all objects and culls them between far/near objects.
  • bindMesh for sun/planet now binds the 'debug python mesh' and its world object.
Created C++ RenderChannelStrip ( instanceable RenderChannelManagers )
Replaced WorldObject::visible with bind( RenderChannelStrip ) and unbind(RenderChannelStrip )
Added DebugChannelStrip for now - a singleton channel strip used by the python console / engine statistics etc., and also the one used by architect etc., things using the OldScene idea.

Monday, March 15, 2010

Stingray : the new engine

Stingray is organised into Channel Strips. Each has a list of RenderChannels, and each of those has a list of DrawCommands.

The strips and channels are created and arranged in an order specified by the game programmer. This design means that a Stingray game is not bound to forward-rendering, or deferred-rendering, single or multi-pass materials etc. It is completely flexible from this standpoint.

Channel Strips are purely for convenience; they are repeatable blocks of Render Channels. For example, implementing a shadow-mapped light requires light-map-write and light-map-read channels. These two make up a LightMapChannelStrip; each light-mapped light thus creates its own LightMapChannelStrip.

The fundamental drawing unit in Stingray, the DrawCommand, exists so that all draw commands can be sorted with respect to one another. This allows each render channel to sort its draw commands to draw in the most optimal order. For example the depth-pass channel requires little state changing, and so sorts its draw commands front-to-back to take advantage of the GPU's z-occlusion culling. The Texturing render channel on the other hand utilises heavy state-setting, and thus sorts its render commands via state blocks. DrawCommands are highly frame coherent, and this fact is taken advantage of - draw commands are not issued to channels per-frame at runtime, instead draw commands are owned by objects and meshes, and meshes are bound() and unbound() infrequently to channel strips.
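The two sort strategies can be sketched like this (DrawCommand fields here are illustrative stand-ins, not the real C++ class):

```python
# Sketch of per-channel sorting: the depth pass sorts front-to-back to
# exploit the GPU's z-occlusion culling; the texturing pass sorts by a
# state key so commands sharing states run back-to-back.

class DrawCommand:
    def __init__(self, view_depth, state_key):
        self.view_depth = view_depth  # distance from camera
        self.state_key = state_key    # e.g. material/state-block id

def depth_pass_order(commands):
    # Front-to-back: nearest first, maximising early-z rejection.
    return sorted(commands, key=lambda c: c.view_depth)

def texture_pass_order(commands):
    # Batch by state: minimises expensive state changes.
    return sorted(commands, key=lambda c: c.state_key)
```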

Mesh files contain a list of MeshPieces - parts of a mesh that have the same material.

Material files contain a list of RenderChannels that draw a mesh piece, for example texture, light, then post-process. For each render channel, XML invokes class factories to create DrawCommands. This means a material may contain passes that are not used by a particular game; if a game only implements some render channels, then parts of the material simply won't be bound at runtime. Thus we have a dynamically-bound multi-pass material system.

When a mesh is bound(), all its mesh pieces use their materials to register draw commands to the appropriate render channels. In some instances, a draw command will be bound to multiple channels - for example, there may be two cameras with complete render channel strips, or five lighting channel strips. Cameras and Lights are therefore treated exactly the same, they are simply render channel strips.
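A minimal sketch of that binding step (data shapes and names are assumptions for illustration - materials map channel names to draw commands, and strips are channel-name lookups):

```python
# Sketch of bind(): each mesh piece's material lists a draw command per
# render channel name; binding registers each command with every strip
# that implements that channel. Channels a game doesn't implement are
# simply skipped - the dynamically-bound multi-pass behaviour.

def bind_mesh(mesh_pieces, channel_strips):
    bound = 0
    for piece in mesh_pieces:
        for channel_name, draw_command in piece["material"].items():
            for strip in channel_strips:
                channel = strip.get(channel_name)
                if channel is not None:          # unimplemented channels
                    channel.append(draw_command)  # are silently ignored
                    bound += 1
    return bound
```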

All resources are referenced by integer IDs and stored in Managers, thus any resource can be reloaded at runtime, and the instance memory is kept to a minimum.

Stingray uses Boost::Python to expose nearly all of the engine to Python. There is no inherent render pipeline, this is created by the game python scripts.

There is no multi-threading in this engine, however it is designed for it. Draw Commands are independent objects, thus getting the draw commands ready on render channels can be done in any order, before the final rendering stage goes through a sorted list of draw commands and submits them to DirectX. So each render channel can cull and sort draw commands in parallel, only the final submission must be done in order. And before this is done, it is possible to double buffer the draw command lists such that the next frame can be made ready while draw command submission is taking place.
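The parallelism the design allows could look something like this (a sketch only - the engine itself is single-threaded, and the cull/sort stand-in here is trivial):

```python
# Sketch: each channel's cull-and-sort is independent, so it can run on
# a thread pool; only the final submission walks the channels in their
# fixed pipeline order.

from concurrent.futures import ThreadPoolExecutor

def prepare_channel(commands):
    # Stand-in for per-channel cull + sort (drop culled, then order).
    return sorted(c for c in commands if c >= 0)

def render_frame(channels):
    # channels: ordered list of raw per-channel command lists.
    with ThreadPoolExecutor() as pool:
        prepared = list(pool.map(prepare_channel, channels))
    submitted = []
    for channel_commands in prepared:  # submission stays in order
        submitted.extend(channel_commands)
    return submitted
```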

What is PEGWARS?

Pegwars, or Political Economic and Galactic Warfare, is my hobby computer game that I've been writing since, um about October 1997. That's 12 years ago!

For anyone that has been around computer games a long time, the word ELITE should mean something to you. Something special. ELITE was the game that hooked me for life. Until ELITE, games were a fun curiosity, usually involving running and jumping a little man, llama or 'Horace' around 2D levels, requiring pixel-perfect accuracy and some bleepy soundtrack that these days is considered somewhat 'retro-cool'. ELITE was a full 3D (wireframe) space combat, exploration and trading game, which involved 8 entire galaxies, alien races, missions and more. All of this on a 48K ZX Spectrum. For an 11 year old boy, that was quite something, and my mind was truly blown.

Ever since university and being able to write somewhat coherent code, I've been writing PEGWARS, my interpretation of this genre. It's my hobby project, and it is now in its 6th iteration. On the way down this merry merry path of late night coding, I've learnt about art, music, game programming and more; I've used it as a demo to get a job in the computer game industry, it's catapulted me into writing electronic music, and it's kept me learning all these years and given me something creative to do and a good reason to avoid television.

This blog is going to be my project diary of the PEGWARS project, a place to document the long-term development of a hobby computer game project.

Pegwars, v1. 2D

My first version of Pegwars, written using DirectDraw 2. Involved such fun as implementing transparencies by ORing colours directly into the framebuffer.

Went on a bit of a tangent in this version, this one had enemy AI that was trained using a Genetic Algorithm. Each enemy had a short DNA string that dictated what they would do given per-frame sensory input. I even had use of a bunch of computers at my workplace at the time to run evolutions of these DNA pilots overnight. Moral of this foray into GA's is, if you want to run many evolutions of something, don't make it a full graphical game - test their fitness in a simulated game environment. Otherwise the amount of time required to run just 100 generations of 100 pilots is going to be far too long and you'll get nowhere.

Pegwars, v2

My first 3D version of Pegwars, and this one was written entirely in software. As you can see, there's no texturing, some pretty crappy 3D models etc. This was a 3D layer on top of my 2D engine written for v1, thus this still involved a fully working 2D interface.

This version featured fully customisable spaceships, you drag and drop engines, thrusters, pulse lasers onto your 3D spaceship and then join the game. If you forgot to add Thrusters you couldn't go anywhere! And that was the gist of what Pegwars was all about - fully modular and customisable spaceships.

Pegwars, v3.

My first 3D accelerated version, written using GLIDE and then DirectX. This one featured fraggable spaceships, even down to the polygon level! Plus some more pretty bad programmer-art. I used this version for my demo to BigWorld Technology, and it got me a job in computer games, surely a dream come true :)

After this version I started to really learn about games programming and decided that I had it all wrong... time to start again

Pegwars, v4.

The first one featuring my new Relativity Engine. Didn't get too far on this before deciding to start again. But as you can see, it certainly featured fully modular ships again, just look at all those thrusters...

The look is starting to get a little more refined, at least I had the FOV sorted out (no more egg-shaped suns or planets). The guys at BigWorld taught me a lot about writing computer games and it was now that I was starting to get the feeling that maybe I'd bitten off just a little too much. My advice to anyone beginning their own hobby computer game is, don't be too ambitious! It's possibly the worst idea in the world to begin writing your own entire universe simulation in your spare time. Which leads me to the next version of my game...

Pegwars, v5.

Also featured in the montage image in this post, v5 was the furthest I got and the best Pegwars yet!

In this version, you can literally fly down and land on any planet in the entire galaxy, then hop out in a motorbike or car and drive around! You could land on space stations, fight enemies and collect bounties, even advance your rank.

A really nice feature that just kind of fell out of this game was sunrise and sunsets :) You just fly down into the atmosphere of a planet, and start flying around it. Lo and behold, as you fly around the night time towards the sun, you slowly see the sun begin to rise, the atmosphere lighten up sunrise colours through the sky. You could keep flying around as the sun rose in the sky then slowly sets behind your spaceship.

This version took years of work and unfortunately it showed. The code started becoming unmaintainable, I could see huge inadequacies in the graphics engine design, and there was no scripting language. Everything was still being written in C++ and this bogged down the speed of developing any new part of the game. I wanted to do more with the graphics but the engine was too inflexible. I took this version of the game as an inspirational prototype and have begun work on an entirely new version of Pegwars, the current version that I am writing now. And this time, I am making no compromises.