Can Fancy New Graphics Tech Save Burned-Out Game Developers?

Ray tracing looks great, and it could alleviate crunch if companies let it

Credit: Shadow of the Tomb Raider/Square Enix via NVIDIA

Real-time ray tracing, the next big innovation in gaming graphics, could eventually make your games look more like the CGI you’re used to seeing in movies. Companies like Nvidia are touting it as a revolution in computer graphics, but its biggest benefit may be easing the burden on the overworked game developers trapped under perpetual crunch time today.

Ray tracing is a powerful technique in computer graphics that simulates the way objects look in the real world. The software casts virtual rays of light and traces (get it?) their path from the light source to the camera, just like light works in real life. In the same way light from a lamp bounces off the book you’re reading and into your eyes, ray tracing blasts CGI objects with rays and captures them in a virtual camera.

This lets the program cast natural shadows, light objects properly, and produce convincing reflections with little extra effort. If a lot of rays hit an object, the object is brighter. If an object blocks a light source, it casts a shadow. Where the old method was like tediously drawing an animated movie, ray tracing is more like shooting a live-action movie. You can hang the virtual lights, and it all just looks right.
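
If you’re curious what that looks like in code, here’s a toy sketch in Python. The two-sphere scene, the ASCII “image,” and every number in it are invented for illustration, and real engines do all of this on the GPU (most also trace rays backward, from the camera out into the scene, rather than from the light):

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def hit_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the sphere, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 1e-4 else None  # small epsilon so a surface can't shadow itself

# Invented toy scene: two spheres and one point light, camera at the origin.
spheres = [([0.0, -0.2, -3.0], 1.0), ([1.2, 0.8, -2.2], 0.4)]
light = [3.0, 3.0, 0.0]

for y in range(12):
    row = ""
    for x in range(32):
        # Primary ray: from the camera through this "pixel" into the scene.
        d = normalize([(x / 32) * 2 - 1, 1 - (y / 12) * 2, -1.0])
        hits = [(hit_sphere([0.0, 0.0, 0.0], d, c, r), c) for c, r in spheres]
        hits = [(t, c) for t, c in hits if t is not None]
        if not hits:
            row += " "          # the ray hit nothing: background
            continue
        t, center = min(hits)   # nearest surface the ray ran into
        point = [di * t for di in d]
        normal = normalize([p - c for p, c in zip(point, center)])
        to_light = normalize([l - p for l, p in zip(light, point)])
        # Shadow ray: if anything sits between this point and the light,
        # the point is in shadow. No hand-authored shadow needed.
        blocked = any(hit_sphere(point, to_light, c, r) is not None for c, r in spheres)
        shade = 0.0 if blocked else max(0.0, sum(n * l for n, l in zip(normal, to_light)))
        row += "#" if shade > 0.5 else ("+" if shade > 0.0 else ".")
    print(row)
```

The part worth noticing is the shadow ray: nobody drew or tuned that shadow. It falls out of the same visibility test used for everything else.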

The problem is that ray tracing takes a lot of processing power. Until recently, rendering even a single frame with ray tracing on a typical computer could take hours. That’s fine when you’re making a movie like Toy Story 4 and have huge server farms at your disposal. It’s less feasible for video games that have to generate 30 (or 60) frames every second.

Instead, video games traditionally use a technique called rasterization that creates a 2D image of a scene from the camera’s perspective. Then, shaders are applied to that image to approximate what light, shadows, reflections, and other details should look like. Think of it like trying to paint a circle that looks like a ball. A good artist might be able to get close, but nothing will ever be as accurate or as easy as taking a photograph of a ball.
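
For comparison, here’s a stripped-down sketch of the rasterization side, again with made-up values: geometry gets projected onto a flat image, and the lighting is approximated afterward by a shader formula rather than by following any rays.

```python
import math

def project(point, width=640, height=360, focal=1.0):
    """Perspective-project a camera-space point (camera looks down -z) to pixel coords."""
    x, y, z = point
    sx = (x * focal / -z + 1) * 0.5 * width
    sy = (1 - (y * focal / -z + 1) * 0.5) * height
    return int(sx), int(sy)

def lambert_shade(normal, light_dir, base_color=(200, 180, 160)):
    """A typical shader-style approximation: brightness from a single dot product."""
    n = math.sqrt(sum(c * c for c in normal))
    l = math.sqrt(sum(c * c for c in light_dir))
    d = max(0.0, sum(a * b for a, b in zip(normal, light_dir)) / (n * l))
    return tuple(int(c * d) for c in base_color)

vertex = (0.5, 0.25, -3.0)                     # a made-up point on some model
print(project(vertex))                         # where it lands on the 2D image
print(lambert_shade((0, 1, 0), (1, 2, 0.5)))   # how bright the shader guesses it is
```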

“The nice thing about ray-traced shadows is that they’re… just kind of algorithmically right.”

The innovation in “real-time ray tracing,” therefore, isn’t necessarily the ray tracing part—it’s “real-time.” Nvidia currently has a few graphics cards on the market capable of real-time ray tracing, and Sony says its next gaming console will support the feature as well, whenever it comes out. It’s likely Microsoft’s Xbox will eventually follow suit. A decade or so from now, real-time ray tracing will be commonplace, allowing developers to create realistic graphics without all the tedious workarounds.

That future can’t come fast enough for overworked game developers.

Stories of video game studios operating under “crunch time” — a prolonged, semipermanent period of working late nights and weekends to get games done before a deadline — are as common as they are depressing. Rockstar, the developer behind the Grand Theft Auto franchise, had developers working 60- to 80-hour weeks to meet Red Dead Redemption 2’s deadline. Epic, the company behind online megahit Fortnite, acknowledged a few cases of employees working up to 100 hours a week in what some workers consider a constant crunch lifestyle.

Much of that excessive work is spent on very minor but important details. As Tony Tamasi, senior vice president of content and technology at Nvidia, explained to OneZero, the process of getting shadows in just the right spot — called shadow mapping — can be tedious and time-consuming. “It’s a pretty good amount of time developers and artists have to spend custom-tweaking things to get shadow maps to look right… And those are the kinds of things that ray-traced shadows all just magically solve for you.”
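
For readers who want to see why that’s fiddly, here’s roughly what shadow mapping boils down to. This is a toy sketch rather than engine code, and the scene, resolution, and bias value are all made up:

```python
SHADOW_MAP_SIZE = 64
BIAS = 0.05   # the kind of per-scene fudge factor artists end up hand-tuning

def build_shadow_map(depth_from_light):
    """depth_from_light(u, v) -> distance to the nearest surface in that direction,
    as seen from the light. The grid of these distances is the shadow map."""
    return [[depth_from_light(u / SHADOW_MAP_SIZE, v / SHADOW_MAP_SIZE)
             for u in range(SHADOW_MAP_SIZE)]
            for v in range(SHADOW_MAP_SIZE)]

def in_shadow(shadow_map, u, v, distance_to_light):
    """True if something closer to the light already occupies this direction."""
    iu = min(int(u * SHADOW_MAP_SIZE), SHADOW_MAP_SIZE - 1)
    iv = min(int(v * SHADOW_MAP_SIZE), SHADOW_MAP_SIZE - 1)
    return distance_to_light > shadow_map[iv][iu] + BIAS

# Toy occluder: everything left of u = 0.5 has a wall 2.0 units from the light,
# everything else is open sky (effectively infinite depth).
shadow_map = build_shadow_map(lambda u, v: 2.0 if u < 0.5 else 1e9)

print(in_shadow(shadow_map, 0.25, 0.5, 5.0))   # behind the wall -> True
print(in_shadow(shadow_map, 0.75, 0.5, 5.0))   # open side       -> False
```

That BIAS constant is a stand-in for exactly the kind of hand-tuned fudge factor Tamasi is describing: too small and surfaces speckle with “shadow acne,” too large and shadows visibly detach from the objects casting them.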

Shadow of the Tomb Raider, which uses Nvidia’s RTX platform to render shadows, shows how much work ray tracing can save. In the first picture below, you can see a few shadows cast by the wooden spikes onto a stone wall.

It looks fine, right? Intricately detailed, even. These shadows would’ve been created by an artist or developer who crafted the shape of the shadows, adjusted how they behave as Lara moves her light around, and made sure there were no glaring bugs. It takes time, but the end result looks pretty good.

Now take a look at that same location with ray tracing turned on.

Look closely. Did you even notice in the first image that there was no shadow coming from the little flags? Or the ropes? Or the rocks behind the spikes? With the flip of a switch, every object starts casting shadows, not just the ones that are designed to do so. The shadows are also more accurate, getting softer the farther away the shadow is from the object casting it.

Cody Darr is an independent developer and creator of Sonic Ether’s Unbelievable Shaders, a mod that turns Minecraft’s basic block design into a detailed thing of beauty. Early versions of this mod used the old rasterization technique, but now Darr has switched to a ray-traced model. “I used to spend countless hours tweaking various faked lighting techniques to achieve the look I wanted,” Darr explains. “With ray tracing, those endless hours of tweaking to find the right balance for fake lighting effects have vanished. Everything just looks right!”

“Technology powering game development has only gotten better, yet crunch time has only gotten worse.”

For now, most players don’t have ray tracing–capable hardware, so game studios will still need to rely on the old methods. Ironically, supporting both rasterization and ray tracing adds more work, though not much. “Yeah, it’s not free,” Tamasi says. “It’s some amount of incremental work. There’s no doubt about that. But the nice thing about ray-traced shadows is that they’re… just kind of algorithmically right.”

Ray tracing can still help developers, even if many players can’t take advantage of it yet. Tamasi explained another technique used during development called light baking. This involves precalculating what the lighting in a scene should look like and saving that information to disk as a texture called a lightmap. Building lightmaps can take a long time, and even a small change means building them all over again. “Every time an artist changes the light or moves the position, or you change the material, you’ve got to rebake it,” Tamasi says. With real-time ray tracing, he adds, “What used to take hours can now take a couple minutes.” Even before real-time ray tracing reaches consumers, it can save developers loads of time.
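
Here’s a simplified picture of what baking means, with an invented compute_lighting() standing in for the expensive offline calculation, which is the part real-time ray tracing speeds up:

```python
import json

def compute_lighting(texel_u, texel_v, lights):
    """Stand-in for the slow part: sum each light's falloff at this texel."""
    total = 0.0
    for lx, ly, intensity in lights:
        dist_sq = (texel_u - lx) ** 2 + (texel_v - ly) ** 2
        total += intensity / (1.0 + dist_sq)
    return total

def bake_lightmap(lights, size=16):
    """Precompute lighting for every texel and return the whole grid."""
    return [[round(compute_lighting(u / size, v / size, lights), 3)
             for u in range(size)]
            for v in range(size)]

lights = [(0.2, 0.3, 1.0), (0.8, 0.7, 0.5)]   # made-up light positions and intensities
lightmap = bake_lightmap(lights)

# The bake gets saved to disk and simply looked up at runtime...
with open("lightmap.json", "w") as f:
    json.dump(lightmap, f)

# ...but nudge a light even slightly and the whole grid has to be recomputed.
lights[0] = (0.25, 0.3, 1.0)
lightmap = bake_lightmap(lights)
```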

Further into the future, when ray tracing becomes commonplace, developers can start to drop older techniques in favor of these easier, faster methods. Whether that leads to less crunch time for developers is another question. “Sadly, I think it’s more of a cultural issue,” Darr says. “Technology powering game development has only gotten better, yet crunch time has only gotten worse.”

Some jobs in game development — like writing or QA testing — won’t benefit from the ray-tracing revolution at all. And some companies may fill the reduced workload with yet more work. It’s not a magic bullet. But it is a chance to rethink the workloads put on the people who make games.

“It’s easy to blame game studios for not caring about the quality of life of its employees,” Darr says. “But gamers usually don’t think about the quality of life of game developers when they play a game. They think about the quality of the game itself, and game studios know that.”

Eric Ravenscraft is a freelance writer from Atlanta covering tech, media, and geek culture for Medium, The New York Times, and more.
