Unless you’re a nerd (guilty!) you probably have absolutely no idea about what makes a game tick – what technology drives our favourite pastime? Now, understandably, there are some of you who probably don’t give two flying side kicks about how it all works, but for the budding closet nerds among us, we thought we’d peel back the skin on gaming, peer into the engine bay for a moment and take a good look at all the whirring gears and blinking lights inside. We hope to bring you some insight in plain English and break down the mystique with our SNR TechnoTalks.

Have you ever thought about what makes objects, terrain and atmosphere (smoke, light, dust) appear and move on your screen? Traditionally, a large chunk of any game maker’s budget is thrown at content generation, where in-game bits are lovingly crafted by keyboard jockeys in windowless bunkers, intricately placed into equally human-constructed environments and then animated within them. Not only is this costly during production, it’s even more costly in terms of what gets stored in your game files and on the final cut of the disc you put into your machine. Large environments and all the stuff filling them are stored as assets, and those assets take up space and processing power. The bigger the environment and the more stuff in it, the harder your machine has to work to keep all those balls in the air and running smoothly.

A simple example: animating a bird on the other side of the map from where you’re standing. In traditionally made games, that bird is being animated in its part of the environment whether you can see it or not. That’s chewing up processing power while the game engine is running, and when you multiply it by every other asset being animated at the same time, you can quickly see how things become top heavy.

But that was then. This is now. Enter the new kid on the block – Procedural Content Generation. “What the hockey puck is that?”, we hear you ask. At its simplest, it’s a set of formulas that generates in-game assets only when you need to see them. Back to our bird: if the bird isn’t on your screen, it isn’t being animated in the engine, but the moment you move anywhere near where the bird should be, its animation seamlessly starts up and the area you left behind stops being animated. So as you move from one cell to another, the animation overhead is limited to only what you can see. This is a huge win for processing power, the size of the environment and the number of assets within it. But what happens when you move back to your original position? How does the game know where everything should be in relation to where you’ve just been? This is the cool part! The algorithm keeps track of where things should be without animating them, and as soon as you return to your original position, the animation picks up right where it should. If you have any kids in your life (or even yourself) who play Minecraft, you’ve already seen this technology working perfectly. Pretty awesome, but it goes one step further still.
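For the closet coders among us, here’s a tiny sketch (in Python, with made-up names and numbers – a real engine is far more sophisticated) of how a cell-based world can be generated only as you approach it, yet come back exactly as you left it. The trick is that nothing about a cell is stored: the seed and the formula *are* the storage.

```python
import random

WORLD_SEED = 1234  # hypothetical seed picked when the world is created

def generate_cell(cx, cy):
    """Deterministically generate one map cell's contents on demand.

    Mixing the world seed with the cell coordinates means the same
    cell always produces the same contents, so it can be thrown away
    the moment it scrolls off screen and rebuilt identically later.
    """
    rng = random.Random(WORLD_SEED ^ (cx * 19349663) ^ (cy * 83492791))
    return {
        "trees": rng.randint(0, 10),
        "birds": rng.randint(0, 3),
    }

def visible_cells(player_cx, player_cy, radius=1):
    """Only the cells near the player ever get generated or simulated."""
    return {
        (x, y): generate_cell(x, y)
        for x in range(player_cx - radius, player_cx + radius + 1)
        for y in range(player_cy - radius, player_cy + radius + 1)
    }

# Walk away, come back: the cell is rebuilt exactly as it was.
assert generate_cell(4, 7) == generate_cell(4, 7)
```

Everything outside the visible radius simply doesn’t exist in memory until you wander near it again.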

Have you ever played Borderlands? Did you know there are over 17 million weapon variations in there? There’s no way even an army of humans could come up with that many separate weapons and then individually build and test each one. PCG is used here to deliver more weapons than anybody could even imagine and keep things interesting. But it goes a step further again! The animation accompanying this post is from Star Citizen. It shows how PCG is used to make one long, seamless in-engine shot, from a person’s eyeball scoping all the way out to a huge planet, capturing the effects of the atmosphere along the way. Everything you see on screen is generated and rendered just as you need to see it, in real time. Game changing! (excuse the pun). But wait, there’s still more! The upcoming title No Man’s Sky has taken PCG to the next level by using the technique not only to animate things as you need to see them but also to generate whole environments, planets, solar systems, galaxies and an entire universe of over 18 quintillion possible worlds. Wow. The algorithms even go as far as determining whether a planet should have water, snow or minerals based on how close it is to its nearest sun. And even more interestingly, the creatures that inhabit each planet are also created with PCG, based on the environments they’re born into, taking into account the resources available and the evolutionary possibilities for that environment. This is mind-blowing stuff! A whole game created organically from physical and biological algorithms and rendered in real time as you need to see it. Welcome to the future, kids.
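How do you get millions of weapons out of a few lists? By mixing and matching parts, the variations multiply rather than add. Here’s a hedged little Python sketch of the idea – the part pools are invented for illustration, not Borderlands’ actual system – showing how four short lists explode into thousands of combinations, with only a seed needing to be stored per weapon:

```python
import random

# Invented part pools for illustration. The real games use far richer
# data, but the principle is the same: variations multiply, not add.
BODIES   = ["pistol", "smg", "rifle", "shotgun", "launcher"]
BARRELS  = ["short", "long", "vented", "heavy"]
ELEMENTS = ["none", "fire", "shock", "corrosive", "ice"]

def weapon_from_seed(seed):
    """Rebuild a whole weapon from nothing but its seed."""
    rng = random.Random(seed)
    return {
        "body": rng.choice(BODIES),
        "barrel": rng.choice(BARRELS),
        "element": rng.choice(ELEMENTS),
        "damage": rng.randint(10, 200),
    }

# Five bodies x four barrels x five elements x 191 damage values:
total = len(BODIES) * len(BARRELS) * len(ELEMENTS) * (200 - 10 + 1)
print(total)  # 19100 possible weapons from four short lists
```

Scale the lists up and stack on a few more slots (grips, sights, stocks, rarity tiers) and you’re into the millions before you know it – and because the seed rebuilds the weapon deterministically, the game never has to store the finished gun at all.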

It’s clear that Procedural Content Generation is going to make its way into a lot of upcoming games, and the boundaries of what can be achieved with it are already being pushed. But there’s a balancing act here, and one that needs to be managed. The ultimate expression of a system that generates its own content is chaos! And that, my friends, is the art – controlling how much chaos is allowed to exist while making sure it isn’t game-breaking. No Man’s Sky will be the benchmark for this contained-chaos technique, and we can’t wait to see what bizarre and alien oddities it generates.
