In August 2017, Eddie Tipton was convicted of masterminding one of the biggest lottery scams in U.S. history. As the Multi-State Lottery Association’s former information security director, Tipton had access to the software that generated random numbers for the lottery. He inserted extra lines of code so that when certain conditions were met, the algorithm would follow a different path, producing a smaller, more predictable set of winning lottery numbers. Tipton rigged six winning drawings across five U.S. states, amounting to more than $24 million in prize money.
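To see why this kind of backdoor is so effective, consider a minimal sketch of the general technique: a trigger condition that quietly collapses the random number generator's seed space so an insider can enumerate every possible drawing. The function name, the trigger flag, and the seed-space size of 200 are all illustrative assumptions, not details of Tipton's actual code.

```python
import random

def draw_numbers(seed: int, trigger: bool = False) -> list[int]:
    """Draw six 'winning' numbers from 1-49.

    If the (hypothetical) trigger condition holds, the seed is
    silently reduced modulo 200, so only 200 distinct drawings
    are possible -- few enough for an insider to enumerate and
    buy tickets for in advance.
    """
    if trigger:
        seed %= 200  # backdoor: collapse the seed space
    rng = random.Random(seed)
    return sorted(rng.sample(range(1, 50), 6))
```

From the outside, the triggered output still looks random; only someone who knows the trigger can list the small set of outcomes in advance.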
From lotteries and elections to cryptography and quantum mechanics, randomness plays a critical role in ensuring fairness — but only if the source of randomness can be trusted at all times. This may not always be the case for a single source, which could have its own bias or could be influenced by outside forces, as seen in Tipton’s lottery rigging. And this is where decentralization comes in, providing distributed sources of unbiased, unpredictable, and verifiable randomness, which could improve reliability and make it harder to compromise.
Generating randomness dates back centuries. Researchers from the University of California, Davis and the American Museum of Natural History found that ancient people would roll large, irregularly shaped items made of bone, clay, ivory, metal, or stone to determine their fate. This later evolved into games of chance involving the rolling of cubic dice — a classic example of randomness. With each roll, a die has an equal likelihood of landing on any of its six sides, but which side it will land on is unpredictable. As the researchers wrote in their study, “Gamblers may have seen dice throws as no longer determined by fate, but instead as randomizing objects governed by chance.”
Randomness transcended games of chance, and in 1951, the world’s first commercially available general-purpose computer, the Ferranti Mark 1, was born. With it came a…