On Star Trek: Beyond I had the luck to be put on the Yorktown team, where I developed a full-fledged city generator system to populate the Yorktown space station with cities.
The system I developed took a city footprint consisting of numerous city blocks as input and simulated the buildings into place using Houdini's Bullet solver. Because the buildings collided on contact, this prevented them from intersecting with each other. Using a POP attract technique I then pulled each building toward the edge of its city block, forcing it to align with that edge and face the road.
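The edge-alignment idea can be illustrated outside Houdini. The sketch below is a hypothetical simplification, not the production POP setup: it nudges a building's 2D center a fraction of the way toward the closest point on the block boundary each step, which is the essence of an attract force pulling buildings onto the road edge. All function names and the force model are illustrative.

```python
def closest_point_on_segment(p, a, b):
    """Project point p onto segment a-b (all 2D tuples)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    length_sq = dx * dx + dy * dy
    if length_sq == 0.0:
        return a
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / length_sq))
    return (ax + t * dx, ay + t * dy)

def attract_to_block_edges(position, edges, strength=0.5):
    """Move a building center a fraction of the way toward the
    nearest block edge; repeated steps align it with the road."""
    targets = [closest_point_on_segment(position, a, b) for a, b in edges]
    tx, ty = min(targets,
                 key=lambda q: (q[0] - position[0]) ** 2 + (q[1] - position[1]) ** 2)
    px, py = position
    return (px + (tx - px) * strength, py + (ty - py) * strength)

# A unit-square block and a building floating in its interior;
# after a few steps it settles onto the bottom edge (the "road").
block_edges = [((0, 0), (1, 0)), ((1, 0), (1, 1)),
               ((1, 1), (0, 1)), ((0, 1), (0, 0))]
pos = (0.4, 0.1)
for _ in range(10):
    pos = attract_to_block_edges(pos, block_edges)
```

In the real setup Bullet resolves the collisions while the attract force does the aligning; here the building simply converges to the nearest edge.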
Using Python I stored each city block and its relative location on the Yorktown to a file so that, once all city blocks had been simulated, they could be pulled back into their correct locations at the click of a button, forming the entire city. At this stage only the bottom sections of the buildings were present (there were mid and top sections as well).
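The bookkeeping step amounts to serializing each block's name and transform, then reading everything back to re-place the blocks in one go. A minimal sketch, assuming a JSON file; the format and field names are illustrative, not the production schema:

```python
import json
import os
import tempfile

def save_blocks(blocks, path):
    """Write the list of block records (name + transform) to disk."""
    with open(path, "w") as f:
        json.dump(blocks, f)

def load_blocks(path):
    """Read the block records back for re-assembly of the full city."""
    with open(path) as f:
        return json.load(f)

# Hypothetical block records: a name plus its placement on the station.
blocks = [
    {"name": "block_A", "translate": [120.0, 0.0, -45.0], "rotate_y": 90.0},
    {"name": "block_B", "translate": [310.0, 0.0, 12.5], "rotate_y": 0.0},
]
path = os.path.join(tempfile.mkdtemp(), "blocks.json")
save_blocks(blocks, path)
restored = load_blocks(path)
```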
A post-process I wrote mainly in Python would then read the attributes on these base sections to determine their type and height, and stack as many mid sections on top as the skyline height allowed. This height differed across the city, bound by a minimum and a maximum value. Once the maximum was reached, Python would make the final section a top section.
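The stacking logic can be sketched in a few lines. This is a simplified illustration under assumed fixed section heights (the real sections carried their heights as attributes): keep adding mid sections while another mid plus the top still fits under the skyline limit, then cap with the top section.

```python
def stack_building(base_height, mid_height, top_height, skyline_height):
    """Return the ordered list of section labels for one building,
    stacked as tall as the skyline limit allows."""
    sections = ["base"]
    height = base_height
    # Add a mid section only if it and the final top section still fit.
    while height + mid_height + top_height <= skyline_height:
        sections.append("mid")
        height += mid_height
    sections.append("top")
    return sections

# Example: four mid sections fit before the top caps the building.
tower = stack_building(base_height=10.0, mid_height=6.0,
                       top_height=4.0, skyline_height=40.0)
```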
To define these minimum and maximum values I developed a painting system using red values between 0 and 1, which were then multiplied by the maximum height attribute. This system effectively allowed me, or any artist, to paint the skyline.
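The mapping from paint to height is a simple scale-and-clamp. A minimal sketch, assuming the red channel scales the maximum height and the result is clamped to the minimum so no building drops below the lower bound; the parameter names are hypothetical:

```python
def skyline_limit(red, max_height, min_height):
    """Map a painted red value in [0, 1] to a local skyline height."""
    return max(min_height, red * max_height)

# Painting darker red lowers the skyline, down to the floor value.
limits = [skyline_limit(r, max_height=100.0, min_height=20.0)
          for r in (0.0, 0.15, 0.5, 1.0)]
```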
Using a packed primitive technique I then wrote the city out as a point cloud, with all the attributes needed to recreate the city stored on each point, for Clarisse, our lighting tool. The actual building sections were instanced onto my point cloud in Clarisse, which could then be rendered to reveal the final image.
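Conceptually, the hand-off flattens the city to a list of points, each carrying what a downstream instancer needs: which asset to place, where, and how it is oriented and scaled. The attribute names below mirror common instancing conventions and are not Clarisse's actual schema:

```python
def city_to_point_cloud(buildings):
    """One point per building section, carrying instancing attributes."""
    points = []
    for b in buildings:
        points.append({
            "P": b["position"],          # world-space position
            "instance": b["asset"],      # which section geometry to use
            "orient_y": b["rotation"],   # rotation about the up axis
            "scale": b.get("scale", 1.0),
        })
    return points

# Two stacked sections of one hypothetical tower become two points.
cloud = city_to_point_cloud([
    {"position": [0.0, 0.0, 0.0], "asset": "tower_base_03", "rotation": 90.0},
    {"position": [0.0, 12.0, 0.0], "asset": "tower_mid_01", "rotation": 90.0},
])
```

Because only points travel downstream and the heavy geometry is instanced at render time, the scene stays light no matter how large the city grows.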
Thanks to this point-cloud instancing approach we were able to display the cities and the Yorktown itself, 1.3 trillion polygons in total, in real time.
Due to the vast nature of the task at hand – a space station 16 miles in diameter containing 64 of these cities – I had to automate and proceduralize the system as much as possible.
Various Python scripts throughout the set-up allowed me to drive it automatically. Generating a city now took less than an hour, so we were able to quickly turn around changes in client briefs, layout, art direction and so forth.
Cinefex 148 makes mention of DNeg’s “new generation proprietary urban layout software” on page 77, of which my city generator system is one of three key parts.
I was lucky to be part of an incredibly talented team who taught me more in that time than I could have imagined. For this, I am immensely grateful.
Other FX work on Star Trek: Beyond includes volumetric lighting and rendering, creating moving debris clouds, smoke simulation/advection for nebulae and fog, and rigid body dynamics.
A few of my shots are shown below, as well as a video featuring my work, which I cut together from the trailers.