Profile photo

Liam de Valmency

Game developer and programmer, currently helping to make awesome games at Media Molecule.

Personal Projects

My spare-time game creations can be found here.

I've recently started learning 3D modelling; my progress can be found on Sketchfab.

I'll usually post screenshots/videos of what I'm working on in my spare time on Twitter.

Dreams - PS4 (2014-2017)

Since graduating in 2014, I've been working at Media Molecule on Dreams, a user-generated content game for the PS4.

Dreams empowers players to make anything they can imagine, in intuitive ways, with no barrier to entry other than a PS4 and a controller. Players can create sculptures, environments, scenes, games, animations, music, and sounds, then share all of their creations with the world. They can also explore, play, listen to, and watch the creations of the rest of the Dreams community.

Release date TBA.

VRunner (2013)

A VR free-running game in which the player is an antivirus agent inside a computer, created as an entry for Oculus/IndieCade's three-week VR Game Jam. It was my first experience learning and using Unity, and my first game jam. The game features five levels, free-running puzzles, riddles to answer, and a fully animated player body.

Download the Windows build of the game (requires an Oculus Rift DK1 to play).
Find the source code on BitBucket.

Dissertation: Procedural City Generation (2013)

As part of my degree, I wrote a dissertation evaluating existing techniques for generating cities procedurally. It found that most existing methods base their generative processes on heuristics about the nature of city layouts, which limits their accuracy. I set out to develop an algorithm that would generate city layouts based on real-world data.

The final algorithm was based on texture synthesis techniques, which take data from an initial image and extrapolate from it to fill gaps or expand the image. The end result was a novel algorithm that takes city data in SVG format and uses a grid of city patches to piece together a new layout. This has the advantage that the generated cities mimic the style of the input maps.
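The dissertation has the full details, but as a rough, hypothetical sketch of the grid-of-patches idea, the Python below tiles an output grid by greedily picking, for each cell, the library patch whose borders best match the patches already placed to its left and above. It works on a rasterised map rather than the SVG input the real algorithm uses, and the function names are illustrative, not taken from the actual implementation:

import numpy as np

def extract_patches(city_map, patch_size):
    # Cut the rasterised input map into a library of square patches.
    h, w = city_map.shape[:2]
    patches = []
    for y in range(0, h - patch_size + 1, patch_size):
        for x in range(0, w - patch_size + 1, patch_size):
            patches.append(city_map[y:y + patch_size, x:x + patch_size])
    return patches

def border_cost(candidate, left, top):
    # Sum of squared differences along the edges shared with
    # already-placed neighbours; lower means a better fit.
    cost = 0.0
    if left is not None:
        diff = candidate[:, 0].astype(float) - left[:, -1].astype(float)
        cost += float(np.sum(diff ** 2))
    if top is not None:
        diff = candidate[0, :].astype(float) - top[-1, :].astype(float)
        cost += float(np.sum(diff ** 2))
    return cost

def synthesise_layout(patches, grid_w, grid_h, seed=None):
    # Greedily fill a grid of cells, choosing for each cell the patch
    # whose borders best match its left and top neighbours.
    rng = np.random.default_rng(seed)
    grid = [[None] * grid_w for _ in range(grid_h)]
    for gy in range(grid_h):
        for gx in range(grid_w):
            left = grid[gy][gx - 1] if gx > 0 else None
            top = grid[gy - 1][gx] if gy > 0 else None
            if left is None and top is None:
                grid[gy][gx] = patches[rng.integers(len(patches))]
            else:
                costs = [border_cost(p, left, top) for p in patches]
                grid[gy][gx] = patches[int(np.argmin(costs))]
    # Stitch the chosen patches into one output map.
    return np.vstack([np.hstack(row) for row in grid])

Border matching like this is the standard trick in patch-based texture synthesis; the actual algorithm differs in how patches are selected and combined from the SVG city data, and the sketch is only meant to convey the overall approach.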

Below are examples of two city layouts generated by the algorithm. The first is based on an input containing map data from Chicago, in which the roads and buildings are well-aligned. The second is based on an input containing map data from Southampton, where the roads are more irregular.

Procedurally generated cities

Find the source code on GitHub.

A paper describing the technique was published at the Irish Machine Vision & Image Processing Conference.