Dave Kennedy once broke into a bank vault with a password he found in a worker’s desk drawer. The desk was locked, but picking it was trivial, and it took about 30 seconds to get the code that would give him access to safety deposit boxes and bags of cash. “Unfortunately,” he laughs now, “we had to give it back.”
Kennedy is a security consultant and co-founder of the cybersecurity firms Binary Defense, which defends against hackers, and TrustedSec, which simulates their attacks. “We get to be simulated bad folks but we don’t have to go to prison,” he says. “Which is awesome.”
Given the enjoyment he finds in playing at someone he’s not, it’s not surprising that Kennedy, now 36, is a lifelong video-game player. As a kid, he learned how to code so that he could run his own text-based online multiplayer game known as a Multi-User Dungeon, which later helped him get his first job in cybersecurity for the Marines. Kennedy thinks that video games could help people develop skills needed to meet the growing demand in the cybersecurity industry, especially now that so many games are designed to make the player feel like a hacker themselves.
The term “hacking” once simply meant exploring and understanding systems more deeply. But around the 1980s, the mainstream media began using it to signal malicious intent: unauthorized intrusion through the manipulation and exploitation of technology and, as with Kennedy’s bank vault, the people who use it. Hollywood took this definition and ran with it. The hackers of 1980s and ’90s movies were meddling kids going places they shouldn’t, as in 1983’s WarGames, in which a teenage Matthew Broderick accidentally gains access to a military supercomputer and almost starts a nuclear war.
Video games have a long history of looking to Hollywood. Just two years after the release of WarGames, in 1985, Activision CEO Jim Levy demonstrated the company’s new game Hacker to journalists by pretending his attempts to access the company server had…