Stellaris In VR-is

Now available via GitHub for anyone who wants to take a look at the inner workings


How it works

A beautiful bodge

Manipulating the view

Resetting the view to the center of the galactic map uses a series of AutoHotkey commands: zoom out, drag the mouse to a top-down perspective, and then open the console and use the “goto 0 0” command to center the map.

Opening panels is always handled via hotkeys, which means VR-is doesn’t actually know which panels are open or closed; it only estimates, based on an expected blank start state and the history of hotkeys it has itself pressed.
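Since VR-is can’t query the game’s UI, the open/closed estimate boils down to replaying its own hotkey history from an assumed blank state. A minimal sketch of that bookkeeping (the hotkey-to-panel mapping below is illustrative, not the actual bindings):

```python
# Estimate which Stellaris panels are open purely from the hotkeys
# we have sent, starting from an assumed blank state.
# The hotkey -> panel mapping here is illustrative, not the real bindings.

class PanelTracker:
    HOTKEYS = {"t": "technology", "f1": "planetary_administration"}

    def __init__(self):
        # Assumed blank start state: nothing open.
        self.open_panels = set()

    def press(self, hotkey):
        """Record a hotkey press; most panel hotkeys toggle their panel."""
        panel = self.HOTKEYS.get(hotkey)
        if panel is None:
            return
        if panel in self.open_panels:
            self.open_panels.remove(panel)
        else:
            self.open_panels.add(panel)

    def is_open(self, panel):
        return panel in self.open_panels

tracker = PanelTracker()
tracker.press("t")          # open technology
assert tracker.is_open("technology")
tracker.press("t")          # toggle it closed again
assert not tracker.is_open("technology")
```

One consequence visible in the sketch: if the player ever opens a panel with the mouse inside Stellaris itself, the estimate silently drifts out of sync, which is exactly why a panic script exists.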


Different panels

To achieve the effect of having multiple panels open at once, any panel that isn’t currently open in Stellaris has its view paused (e.g. opening the technology panel captures and freezes the view of planetary administration, so it can still be referenced but no updates will be visible!)

Currently, a panel is updated when the player touches it with their controller.

Going forward this should be expanded into a full ScreenManager class which controls each individual panel and chooses which one is updated and when (perhaps based on which is closest to the player’s view, and so probably what they are most interested in).
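A sketch of how that closest-to-view heuristic might score panels (the panel positions and the cosine scoring are assumptions, not code from the project):

```python
# Pick the single panel to keep live-updating based on which one is most
# aligned with the player's gaze direction. Positions are made-up examples.
import math

def closest_panel(panels, head_pos, gaze_dir):
    """panels: dict name -> (x, y, z) world position; gaze_dir: unit vector."""
    def alignment(pos):
        to_panel = tuple(p - h for p, h in zip(pos, head_pos))
        length = math.sqrt(sum(c * c for c in to_panel)) or 1.0
        unit = tuple(c / length for c in to_panel)
        return sum(a * b for a, b in zip(unit, gaze_dir))  # cosine of angle
    return max(panels, key=lambda name: alignment(panels[name]))

panels = {"technology": (0, 0, 2), "planets": (2, 0, 0)}
# Player at the origin, looking straight down +z:
assert closest_panel(panels, (0, 0, 0), (0, 0, 1)) == "technology"
```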


Ship/Planet selection

These are simply handled with the hotkeys 1–9, hardcoded in Unity for mockup purposes with the planet on 1 and the fleet on 2.


Sending input to Stellaris

AutoHotkey moves the mouse cursor based on the relative position of the VRTK controllers over each panel.
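A sketch of that mapping, assuming each panel occupies a known rectangle on the desktop (the rectangle values here are made up):

```python
# Convert a controller's relative position over a panel into a desktop
# cursor position for AutoHotkey to click. Panel rectangle is an example.

def panel_to_cursor(rel_x, rel_y, panel_rect):
    """rel_x/rel_y in [0, 1] across the panel; panel_rect = (left, top, width, height)."""
    left, top, width, height = panel_rect
    return (round(left + rel_x * width), round(top + rel_y * height))

# Controller halfway across an 800x600 panel anchored at (100, 50):
assert panel_to_cursor(0.5, 0.5, (100, 50, 800, 600)) == (500, 350)
```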


Loading Systems/Hyperlanes

The Stellaris save format can be converted to JSON via this tool, which allows VR-is to display an overlay of the map on top of the main screen (seen in the video as pink hyperlanes)


Requested features

Player at the center of a holographic galactic map

Possible and pretty easy based on current features: placing the player in the center of the galactic map loaded from the save.json would allow them to interact with this holographic/resizable/rotatable map, while the code transforms this input back into 2D screen space to click the mouse for fleet movement


Ship art based on the selected leader background

Possible I guess, just a lot of art work


Communication on main screen

Not sure how to detect when communication arrives, but otherwise it would be easy to place in a panel somewhere in the scene


Home planet visible from windows

Possible but it would be a static view, due to only being able to have one camera!


Multiplayer, commanding the same empire

Possible by streaming the Stellaris capture to all connected players and sending back input via AutoHotkey; however, in this bodge solution they would quickly overwrite and mess up each other’s commands as they all try to input at once

Maybe it would have to work like an old timeshare? Haha


Galactic community

I imagine this as putting on an AR headset overlay which would display the community voting chamber, and being able to see the actual empires in the balconies of the side they are voting for!

Unfortunately I don’t see how this could be possible currently other than trying to do image recognition on the empire flags, or just hardcoding it


Project navigation

Supports SteamVR currently but others would be easy to add due to VRTK

Assets/StreamingAssets contains all the AutoHotkey scripts:

  • panic.ahk – Useful in case you lock yourself in Stellaris through panel interactions (Ctrl+F1 will exit play mode in Unity)
  • resetview.ahk – Run when the player hits the “Reset View” button in VR; tries to recenter the view top-down, showing the whole galaxy




  • Tom’s Hotkeys (Stellaris mod – adds the ability to open the Tech panel on T)
  • OBS (may have to scale down output resolution due to the combined resource intensity of Stellaris, OBS, and Unity)
  • OBS-VirtualCam (target “OBS-Camera”, no buffered frames)
  • Unity 2019.3.4f1
  • Stellaris Save to JSON



Portfolio Page

Painting Prototype

Today’s prototype!

I was inspired by playing the beautiful Eastshade (an open world painting game) but was disappointed to find no painting minigame, so of course I just had to mock one up myself.

It was originally implemented with the plan of simply using the mouse cursor to paint but then I remembered this plugin which allows the Leap Motion Controller to be used in Garry’s Mod, and again – just had to try it.


How it works is actually really simple!

The player performs a short calibration at the start by positioning their hand (yes, hand – the brush is just a prop :]) to define three corners of the game window.  Their hand can then be tracked relative to these corners to allow painting, with distance from the screen converted to brush size.
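The calibration can be sketched as projecting the hand position onto the axes defined by those three corners. The corner choice (top-left, top-right, bottom-left) and the brush-size formula below are assumptions, not the prototype’s actual values:

```python
# Map a tracked hand position into window-space coordinates using three
# calibration corners, and derive brush size from depth. All values are
# illustrative.

def make_mapper(top_left, top_right, bottom_left):
    right = [b - a for a, b in zip(top_left, top_right)]
    down = [b - a for a, b in zip(top_left, bottom_left)]

    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    def to_window(hand):
        rel = [h - a for h, a in zip(hand, top_left)]
        # Project onto the window's axes: (0, 0) top-left, (1, 1) bottom-right.
        u = dot(rel, right) / dot(right, right)
        v = dot(rel, down) / dot(down, down)
        return u, v

    return to_window

mapper = make_mapper((0, 0, 0), (4, 0, 0), (0, -3, 0))
assert mapper((2, -1.5, 0)) == (0.5, 0.5)   # hand at the window's center

def brush_size(depth, max_size=32):
    """Further from the screen (larger depth) -> bigger brush (assumed mapping)."""
    return max(1, round(max_size * min(depth, 1.0)))

assert brush_size(0.5) == 16
```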

So when the player decides on a perspective to paint, a raw, normal screenshot is immediately taken and stored locally.  From there it’s just drawing to various render textures in order to ‘reveal’ the part of the image under the brush & applying the selected ‘paint’.  I’m using what I learned from the Magic: Anomalies prototype to apply shaders & post-processes to the ‘paint’ to achieve different effects – these different styles are linked to the numpad as a sort of mock artist’s palette!  The render textures caused a lot of issues, with transparency not being handled as expected & their lack of documentation – but the end result is so worth it!


I had forgotten just how much fun the Leap is!

We used it for our Tragic Magic jam game and the added physicality is always just a blast!

Magic: Anomalies


Took a break from working on the magic crafting system/UI to quickly prototype this idea my friend DrMelon suggested: anomalies inspired by S.T.A.L.K.E.R.!

They are procedurally generated; each with a custom visual shader effect and physical buff/debuff effect (a poison debuff in the above video). The closer a player is to the center of the anomaly, the faster the effect is applied to the player and the more intense the visual effects become.
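The distance scaling described above can be sketched as a simple falloff (the linear curve and the numbers are illustrative; the real shader and buff curves may differ):

```python
# Anomaly effect intensity: 1.0 at the center, fading to 0.0 at the edge.
# Both the debuff tick rate and visual strength scale with this value.

def anomaly_intensity(player_dist, radius):
    """Linear falloff (an assumed curve): full strength at the center."""
    if player_dist >= radius:
        return 0.0
    return 1.0 - player_dist / radius

def poison_per_second(player_dist, radius, base_dps=10.0):
    """Closer to the center -> the poison debuff ticks faster."""
    return base_dps * anomaly_intensity(player_dist, radius)

assert anomaly_intensity(0.0, 5.0) == 1.0   # standing dead center
assert anomaly_intensity(5.0, 5.0) == 0.0   # at the boundary, no effect
assert poison_per_second(2.5, 5.0) == 5.0   # halfway in, half the damage rate
```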

Magic: Crafting UI Implementation


Aha! It works!

(Left: regen buff applied by spell | Middle: Spell Crafting UI | Right: Debug output of spell reasoning)

I’ve now tested these basic example components with some friends online and can really see the system’s potential coming to life!

However I need to think some more about the layout of the UI if I want to be able to support some more complexity… One idea would be to stick with the foundations of the current design, but make it work recursively to allow as many components as the player wants. I made a quick mockup of this (shown below) but I worry this will also fast become cluttered and unreadable.

(Notice that EYE_TRACE has two possible subcomponents so these split the space and become half size. These pos offsets would all execute one after the other.)

Magic: Crafting UI


I made some quick mockups of a few different ideas I have for the way the crafting UI should look/how players interact with it.

Attempt 1, Spaghetti:

In my first attempt my thoughts were focused on the pure freedom I want to incorporate, which led to this really open but potentially really confusing/cluttered design.  While I like the idea of bigger spells having a convoluted/spaghetti feel to them, this isn’t in line with my desire for the player to be able to read/adapt their spells at a glance during fast-paced gameplay.

Attempt 2, Ordered Recipe Grid:

This more grid-based design solves these issues well, I feel! Now it’s really obvious which components a spell can support, and it makes swapping things in/out as simple as dragging & dropping. Can’t wait to implement it and get a real feel for the possibilities of this system!

Magic: Crafting

Part of my experimentation with ideas for interesting magic systems

Something I always want out of magic systems is true spell crafting, whether it’s premeditated and UI based like in The Elder Scrolls, or done on-the-fly as in Magicka. In this experiment I’m trying to find a happy middle ground between the two; one that supports the depth and possibilities of a more complex system while still allowing players to adapt as they play & avoid spending too much time ‘stuck’ in the UI.

Here’s an example of how this spell is expressed in code:

local comp = {
    Name = "HEAL",
    Type = "SPELL",
    Cost = 100,
    Invoke = function( self, ply )
        print( "Try invoke HEAL" )
        local trigger = self.SubComponents["Trigger"].Value
        local invoke = function()
            local ent = MM_InvokeComponent( ply, self.SubComponents["Patient"].Value )
            ent:AddBuff( 4 )
            MM_Net_Invoke( ent, self.Name .. " " .. ent:Nick() .. " because " .. trigger )
        end
        MM_InvokeComponent( ply, trigger, { invoke } )
    end,
    SubComponents = {
        -- Ent to heal
        ["Patient"] = {
            Type = "TARGET",
            RequiredType = "Entity",
            Value = "TARGET_SELF",
        },
        -- Trigger
        ["Trigger"] = {
            Type = "TRIGGER",
            RequiredType = "Number",
            Value = "TRIGGER_HURT",
            -- Value = { "TRIGGER_TIME", 0 },
        },
    },
}
MM_AddComponent( comp )

Each SubComponent here will eventually be switchable in-game using a node based UI.

One idea for the possible gameplay is that players start with no knowledge but can find these components littered around the map; each one opening up new possibilities for spell combos.

Magic: Drawing from the world

Part of my experimentation with ideas for interesting magic systems

In this system mana, the resource consumed when casting spells, must be drawn from a physical location in the world.  Each player has a variable reach around themselves within which they can draw in mana.

(Player’s view. The dashed circle represents their reach. Notice that the inside is dark as mana has been drained from this area)

The idea here was to create a tactical element to gathering mana, which should force players to move around more as they deplete all sources in their area.


To achieve this in Garry’s Mod I used a 2D array representing positions on the map (no verticality). Whenever the array is altered, each client is notified to update their mana render texture (a top-down view of the map with black squares where mana has been consumed). This texture is then used to project light onto the map from above, leading to the effect pictured.
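A sketch of that bookkeeping (grid size and reach values are illustrative; the real version is GLua and also triggers the render-texture update):

```python
# A 2D mana grid with no verticality: cells flip to "drained" as players
# draw mana within their reach, forcing them to move on.

class ManaGrid:
    def __init__(self, width, height):
        self.drained = [[False] * width for _ in range(height)]

    def draw_mana(self, px, py, reach):
        """Drain every undrained cell within `reach` of the player.

        Returns the amount of mana gained (one per cell here)."""
        gained = 0
        for y, row in enumerate(self.drained):
            for x, cell in enumerate(row):
                if not cell and (x - px) ** 2 + (y - py) ** 2 <= reach ** 2:
                    row[x] = True   # clients would be notified to redraw here
                    gained += 1
        return gained

grid = ManaGrid(8, 8)
first = grid.draw_mana(4, 4, 1.5)
assert first > 0
# The same spot is now depleted, which is the tactical pressure to move:
assert grid.draw_mana(4, 4, 1.5) == 0
```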

(An aerial view of the map. Black areas are void of mana, having been consumed by a player)

Heavy Gullets: Portals

The goal of the portal gateway from the main lobby was to create an awesome first impression of the game; as this is players’ entry to each level, it is the first thing they will see (and then continue to see often).

(The portal effect seen when travelling between levels. Sequence lasts around a second)

By curving the direction of the particles over time I was able to create the simple but convincing effect of the tunnel bending around in various directions before reaching the destination and spitting the player out.

(The effect seen before entering a portal; a hole is cut in the wall with stenciled rendering)

The portal has a gravitational pull on any nearby players, which was then extended to affect the plantlife (notice the tuft of grass to the left of the portal) – again with the goal of making the world seem more real through interactions and feedback.


And here’s a view of the full interaction:

Heavy Gullets: Plantlife

One of my core focuses while creating Heavy Gullets was polish & good game feel. I wanted to make the world feel really alive by adding reactions to the player in every aspect of it. This began with particle effects, however the aspect I’m most proud of is the plant interaction. If a player or bullet moves through a plant then it rustles and bends away from the collider (all clientside). This is a really simple effect accomplished by lerping the plant’s angle in the direction of the collider’s velocity, with some added squash and stretch to the scaling, but it has a huge impact on the feel of the game!
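The lerp described above can be sketched roughly like this (the tuning values are made up; the real version runs per-frame clientside in the engine):

```python
# Bend a plant away from a passing collider: lerp its lean angle toward
# the collider's velocity direction, with a little squash on the scale.

def lerp(a, b, t):
    return a + (b - a) * t

def react(plant_angle, plant_scale, collider_vx, dt, max_lean=45.0, stiffness=8.0):
    """One update step; max_lean/stiffness are assumed tuning values."""
    target = max_lean if collider_vx > 0 else -max_lean if collider_vx < 0 else 0.0
    t = min(1.0, stiffness * dt)
    new_angle = lerp(plant_angle, target, t)
    # Squash slightly while bent, for that squash-and-stretch feel.
    new_scale = lerp(plant_scale, 1.0 - 0.2 * abs(new_angle) / max_lean, t)
    return new_angle, new_scale

# A collider rushing past to the right starts bending the plant over:
angle, scale = react(0.0, 1.0, collider_vx=3.0, dt=0.05)
assert 0.0 < angle < 45.0
assert scale < 1.0
```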

Magic: Tracing Runes


Inspired by classics such as Arx Fatalis and Black & White, I wanted to create a magic system involving drawing.

(Tracing a simple rune, linked to making the player fly forward. In this case the rune’s design is (East:1, North:1, East:1))

The rune detection simplifies any drawn shape into a list of 2D vectors – each rounded to one of 8 compass directions. These are then checked against all defined rune shapes and accepted within certain thresholds. Runes themselves are defined with an ordered list of compass directions and their corresponding distances (Direction:Distance).
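A sketch of that quantize-and-compare step (the threshold value and the rune table are illustrative, not the prototype’s actual numbers):

```python
# Quantize each stroke segment to one of 8 compass directions, then compare
# the resulting (direction, distance) list to a defined rune shape.
import math

DIRS = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]

def to_compass(dx, dy):
    """Round a 2D vector to the nearest of 8 compass directions."""
    angle = math.atan2(dy, dx) % (2 * math.pi)
    index = round(angle / (math.pi / 4)) % 8
    return DIRS[index]

def matches(drawn, rune, dist_tolerance=0.3):
    """drawn/rune: lists of (direction, distance); accepted within a threshold."""
    if len(drawn) != len(rune):
        return False
    return all(d1 == d2 and abs(l1 - l2) <= dist_tolerance * l2
               for (d1, l1), (d2, l2) in zip(drawn, rune))

FLY_FORWARD = [("E", 1), ("N", 1), ("E", 1)]  # (East:1, North:1, East:1)

assert to_compass(1, 0) == "E"
assert to_compass(1, 1) == "NE"
# A slightly wobbly trace still snaps to the right directions and matches:
drawn = [(to_compass(1, 0.1), 1.05), (to_compass(-0.1, 1), 0.9), (to_compass(1, 0), 1.0)]
assert matches(drawn, FLY_FORWARD)
```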


Future Note: I went on to use this system in my virtual reality magic prototype VR! Photies

Rainbow Jam 2016 – Day 2

Second day of #RainbowJam16! Pathing & AI


Added some rudimentary pathing to the dungeon generation today to allow NPCs to navigate & move towards target locations.

The room prefabs now contain interconnected AI pathing nodes which have their door connectors linked to other rooms, making the complete web of paths (I couldn’t use Unity’s navmesh system because of the dynamically generated dungeons).

The actual pathfinding would need more work to be efficient, but I wanted to get something up and running as soon as possible for the jam.
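The node web lends itself to a plain breadth-first search, which matches the works-but-not-efficient state described above; the example dungeon graph here is made up:

```python
# Plain BFS over the web of pathing nodes: rooms hold nodes, door
# connectors link rooms. No heuristics yet, so it's simple but not fast.
from collections import deque

def find_path(graph, start, goal):
    """graph: node -> list of connected nodes. Returns a node list or None."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for neighbour in graph[path[-1]]:
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(path + [neighbour])
    return None

dungeon = {
    "spawn": ["hall"],
    "hall": ["spawn", "armoury", "cells"],
    "armoury": ["hall"],
    "cells": ["hall", "boss"],
    "boss": ["cells"],
}
assert find_path(dungeon, "spawn", "boss") == ["spawn", "hall", "cells", "boss"]
```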

Still don’t know what the game is.

Rainbow Jam 2016 – Day 1

Today marks the start of #RainbowJam16!


Using Unity 5 to create… something. Not sure what the game will be yet, but I wanted to have a go at some nice procedural dungeon generation.

I’m quite happy with what I have so far; it uses room prefabs with connection points to string the rooms together, with grid-based collision detection.

Currently it needs some tweaking to ensure it always generates with a minimum number of rooms, but I’ve made good progress today.

Dare to be Digital: Post Mortem

Team Dziethew had a fantastic weekend showcasing our experimental game ‘The Gods Are Wanting’

(My team mate Dziek talking about our experiences)

Though our original plans to add large amounts of content to the game each night fell through (we spent our nights drinking networking instead), the public reaction to the game was still amazing to see.  People seemed to enjoy the pure fun and weird world of the game, which seemed to make up for the complete lack of polish (we only finished our first build two hours after the event opened).


We were so busy running the stall it wasn’t until the final day that I got the chance to quickly run around and play things.  I had a lot of fun with all the stuff I managed to play, but my favourites were definitely:

• Gravity Pong by Amy Parent, a great twist on a classic game.

• Super Block Party by Sofa Squadron, a really fun party game with plenty of polish.

• Hedra by Kirsty Keatch, a lovely one-touch infinite scroller.

• KUBOT by Marios Michalakos, a sweet minimalist puzzler for iOS.

• The Extraordinary Life and Times of Nigel Farage, Gentleman by Half Lamp Productions, a fantastic satire piece with SIX unique minigames – so much effort & polish.


On the final night, as is traditional at the end of Dare, there was a Ceilidh held with good music, dancing, & a free bar.  A great end to a great event!


Thanks Dare!

Dare to be Digital: Indie Fest

It’s the first day of Dare today, and two hours into the event we finally have a working game – come play!

So after prototyping various different games with Dziek, we changed our idea again a few days ago and spent last night creating this game.

A four player build-your-own-monster-and-fight-them game focused on adding special limbs to your god to give them power.

Planetary Annihilation Modding 8 – Networked Play

Currently working on networking the standalone application element of the mod


(Art belongs to UberEnt & Planetary Annihilation)

Using the Lidgren C# UDP networking library, which has been really easy to work with so far.

Currently players can connect to games and take turns performing actions, which are validated by the server & then sent to all connected clients.


(The turn notification UI is just placeholder at the moment)

Planetary Annihilation Modding 6 – Galaxy Generation

With my Honours project done, I am happy to finally get back to this.  Working on some procedural galaxy generation.


(Art belongs to UberEnt & Planetary Annihilation)

This should allow for some more interesting layouts, as before I was just hard-coding the systems and the routes between them.


The generation works by:

• Randomly positioning a number of systems within a bounding box, while collision testing to ensure there is no overlap between them

• Connecting all systems within a short range of each other

• Grouping together all the systems which are linked

• Finding the shortest paths to link these isolated groups of systems together


It seems to work well enough for my purposes at the moment, but I could make it more interesting by adding some more random paths later on.

The systems also have a procedural name which just combines two strings from arrays of names and suffixes.
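The four steps above plus the name generator can be sketched like this (all sizes, ranges, and name tables are made-up example values, not the mod’s actual data):

```python
# Galaxy generation: scatter systems without overlap, link close pairs,
# group the linked systems, then bridge isolated groups together.
import math
import random

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def generate_galaxy(count, box=100.0, min_gap=8.0, link_range=20.0, rng=random):
    # 1. Randomly position systems in a bounding box, rejecting overlaps.
    systems = []
    while len(systems) < count:
        p = (rng.uniform(0, box), rng.uniform(0, box))
        if all(distance(p, q) >= min_gap for q in systems):
            systems.append(p)
    # 2. Connect all systems within a short range of each other.
    links = {(i, j) for i in range(count) for j in range(i + 1, count)
             if distance(systems[i], systems[j]) <= link_range}
    # 3. Group linked systems: propagate the smallest index through links.
    group = list(range(count))
    changed = True
    while changed:
        changed = False
        for i, j in links:
            root = min(group[i], group[j])
            if group[i] != root or group[j] != root:
                group[i] = group[j] = root
                changed = True
    # 4. Bridge isolated groups with the shortest link between them.
    while len(set(group)) > 1:
        i, j = min(((a, b) for a in range(count) for b in range(count)
                    if group[a] != group[b]),
                   key=lambda ab: distance(systems[ab[0]], systems[ab[1]]))
        links.add((min(i, j), max(i, j)))
        old, new = group[i], group[j]
        group = [min(old, new) if g in (old, new) else g for g in group]
    return systems, links

# Procedural names: just combine two strings from name/suffix arrays.
NAMES = ["Alpha", "Vega", "Kepler"]
SUFFIXES = [" Prime", " Reach", " Minor"]

def system_name(rng=random):
    return rng.choice(NAMES) + rng.choice(SUFFIXES)

rng = random.Random(42)
systems, links = generate_galaxy(12, rng=rng)
assert len(systems) == 12
assert system_name(rng) in {n + s for n in NAMES for s in SUFFIXES}
```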

Planetary Annihilation Modding 2 – Progress!

Progress with the actual modification part: now when players build something they will construct their unique version instead of the default.  This may not seem like much but it forms the foundation of the concept, as it allows each player to have separate statistics and units.

Planetary Annihilation Mod 2

(notice that the tower’s name has been appended with a 0, as I was the first player)

Also thanks to wondible for the Instant Sandbox test environment mod

Planetary Annihilation Modding

I recently started playing Planetary Annihilation: Titans (PA) multiplayer with some mates & have been really enjoying it.

Planetary Annihilation Mod


As with any enjoyable PC game I quickly started looking into the modding scene, to improve our experience with preexisting mods and look into creating mods for the game personally.  I discovered that PA actually has a notable mod community (though fairly small in number), which was strongly encouraged by Uber Entertainment (being listed initially in their Kickstarter as “Advanced Modding features”).


I’ve decided to try and make my own multiplayer version of the game’s Galactic War campaign mode.  In this singleplayer mode the player moves through a galaxy map playing AI skirmishes and unlocking new technologies (e.g. new units or buffs) for each victory.  My current concept is an external galaxy map featuring the functionality of the campaign and allowing for new technologies to be unlocked, which I have created a small prototype of in the C# game framework Otter (as pictured above).  This external application will edit the base .JSON files describing each unit and export them into the game.


The major problem I foresee at the moment is that most of the unit .JSON files are not unique per player (excluding the unique commander units), which would cause any technology unlocked to be shared among all players.  My current low-tech solution to this is to make a copy of each unit per player and then alter their build menu UI to reflect this.
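A sketch of that per-player copy (the unit fields here are invented; real PA unit .JSON files are far larger):

```python
# Copy a base unit definition per player and suffix the name/id, so an
# unlocked technology only changes that player's version of the unit.
import copy
import json

base_tower = {"id": "laser_tower", "display_name": "Laser Tower", "damage": 50}

def unit_for_player(base, player_index):
    unit = copy.deepcopy(base)
    unit["id"] = f'{base["id"]}_{player_index}'
    unit["display_name"] = f'{base["display_name"]} {player_index}'
    return unit

player0 = unit_for_player(base_tower, 0)
assert player0["id"] == "laser_tower_0"
assert player0["display_name"] == "Laser Tower 0"   # "appended with a 0"
# Unlocking a tech for player 0 no longer affects anyone else's copy:
player0["damage"] += 10
assert base_tower["damage"] == 50
assert json.loads(json.dumps(player0))["damage"] == 60
```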


You can follow the project here:

Unity Physics Multiplayer

Started testing out the new Unity multiplayer system today.

Ritual Sim


I made a little physics sandbox with player interaction inspired by Citizen Burger Disorder, namely that the player has two hands which can independently raycast forward and pickup physics objects.


The new UNet networking system is much improved from the old implementation which I tried out a couple of years ago, as it is now integrated more closely with the scene object and component system rather than being purely code based.


Source Code:

Game Jam – The Meatly – Day 3

By the end of yesterday I felt I had a pretty good grasp of the ChilliSource engine, so today I just did some more experimentation with its UI system.


I still didn’t have a game by the end of the jam, but I’m quite happy with having experimented with dialogue systems again, and having an attempt at isometric drawing.

The isometric style was easy to pull off in ChilliSource; as it is both a 2D and 3D engine, I simply placed the tiles at descending y values rather than having to hard-code the draw order.


I saw this pop up on my Twitter feed, which inspired me – moving forward I’m considering creating a procedural city with Kenny’s isometric assets (he has also provided the 3D source files) for my university Procedural Methods coursework (rather than the procedural terrain I was working on before).

Game Jam – The Meatly – Day 2

So far today I’ve been working on the random event dialog system.  ChilliSource’s UI system took a bit of experimentation to get working, but now that I understand how to use it, it seems very powerful.  The event panel was easy to set up & was a nice change from building a UI system from scratch, as I did for Frontier Town‘s dialogue.


Here’s a video of what I have so far:


I’m happy with the way it’s coming together so far, especially the title screen, but in hindsight I should have planned more of the actual gameplay before I began.  For now it seems like my entry will be mostly about the dialogue system, with stats increasing depending on the options you pick.


Next Steps:

  • Game developer skills screen & progression
  • NPCs
  • Proper game city (will probably just load a map)
  • Vehicles

Game Jam – The Meatly – Day 1

I’m participating in The Meatly Game Jam this weekend as a practice run for the upcoming ChilliSource Game Jam, as I want to get some experience with the engine rather than wrestling with it during the proper jam.


The theme for this jam is “All things TheMeatly: Life as a Game Developer”, which is pretty wide open.  I’ve decided to make a dialogue based game in which you play as an aspiring game developer, going to game jams and studying to increase your skills.  I’ve also been meaning to make a game using Kenny’s isometric city assets, so I’m going to incorporate them into the game.  I’m planning for it to be similar to Game Dev Tycoon, but more dialogue based.



The jam started at 16:00, and so far I’ve spent 5 hours experimenting with ChilliSource & making my title screen.  A lot of what I have so far is placeholder, including the title, but here’s a screenshot of what it looks like:

Game Dev Story

Next Steps:

  • Dialog boxes
  • Random events
  • Game developer skills screen & progression
  • Proper game city (will probably just load a map)
  • Vehicles