How the Government’s Multibillion-Dollar Plan to Modernize Its Tech Could Go Horribly Wrong

Our ‘Kill It With Fire’ moment

Jennifer Pahlka
Published in OneZero
Apr 20, 2021

Photo Credit: Edwin Levick via Getty Images

The amount of money that is about to be thrown at modernizing government legacy systems is staggering. The rescue package alone allocates several billion dollars to it, and that’s on top of what’s been spent over the past year trying to make systems at the federal, state, and local levels rise to the pandemic occasion. Philanthropy is opening its pocketbooks as well. For those who’ve been wishing for this for years, we are in a big “be careful what you wish for” moment. So, folks, what’s the plan?

Top of the list will be state unemployment insurance systems. And as Biden and team look at how to dramatically improve our home care economy (thank goodness!), Medicaid will be right behind. No one wants to build a great policy framework for paying home healthcare workers better and upgrading the quality of that care, only to find that it’ll take 10 years to figure out how to change the administrative and payment systems — especially after watching the unemployment insurance systems fail so tragically in their moment of great need. Everyone wants these systems to magically, suddenly, be resilient, adaptable, scalable. It’s an understandable wish, and one I share, but it’s a tall order. I believe they can be all those things, but it will happen neither suddenly nor magically.

In this context, Marianne Bellotti’s new book Kill It With Fire is incredibly well timed. For those alarmed by the provocative title, rest assured the only thing Bellotti advocates torching is the notion of torching itself. And while the book is written for technical leadership, her wisdom is something many nontechnical government leaders need to hear right now, lest they fall prey to the gaggle of advisers saying things like “we just need to get them off the mainframe.” She writes:

When people assume that technology advances in a linear fashion, they also assume that anything new is naturally more advanced and better than whatever they are currently running. Adopting new practices doesn’t necessarily make technology better, but doing so almost always makes technology more complicated, and more complicated technology is hard to maintain and ultimately more prone to failure… Changing technology should be about real value and trade-offs, not faulty assumptions that newer is by default more advanced.

Real value and trade-offs are a good start to the grounding Bellotti provides. I wish she talked more about the role of technologists in partnering with policy and program leaders to refactor legacy policy alongside legacy code, but she does a great job reminding us that everything we know about software development still applies when you’re in a high-stakes modernization project:

The funny thing about big legacy modernization projects is that technologists suddenly seem drawn to strategies that they know do not work in other contexts. Few modern software engineers would forgo agile development to spend months planning exactly what an architecture should look like and try to build a complete product all at once. And yet when asked to modernize an old system, suddenly everyone is breaking things down into sequential phases that are completely dependent on one another.

This is the danger of our current moment. The legitimate desperation of the situation, especially in unemployment insurance, will encourage those who say, “For once, we must do something bold!” To them, “iterate in place” (Bellotti’s default strategy choice) sounds decidedly timid. There are billions of dollars to be spent, and someone needs to be told the plan for spending them. “Build the capacity of the teams who run the service to understand, simplify, and scale their processes and technology” (my words, not Bellotti’s exactly) doesn’t sound like a bold plan. But replicating Kafkaesque business processes driven by decades of accumulated policy cruft on more current technology and attempting a “hard cutoff” to this new platform (which Bellotti correctly points out is the riskiest strategy in the modernization toolbox) is a recipe for disaster. The truth is that billions have already been spent trying to modernize this way, quietly, over the past two decades. We can’t keep doing that.

Allowing these government functions to continue to operate with so little internal technical capacity and so little understanding by their leadership of their own systems is also a recipe for disaster. When I co-chaired a “strike team” appointed by Gov. Newsom last summer to look at problems in delivering unemployment insurance to Californians, my colleague Marina Nitze observed that she could count on one hand the number of department staff who understood how their own system worked from start to finish. It was clear that the leadership of the department did not have a grasp on the complex and fragile set of intertwined policies and technologies that resulted in both the remarkable number of claims that had been processed and the enormous backlog that had also accrued. A few of the vendors had some fragments of useful insight; none of them had the full picture. But the degree of reliance on vendors to figure out what to do was crippling.

The problem with the “just get them off the mainframes” thinking is that it presumes that you can solve the problem with technology. What struggling government services need is capacity — at all levels. Giving them technology without solving the underlying problems is like sending fancy surgery equipment to a clinic without doctors — it’s not clear that’s what’s needed, and they may not know what to do with it. If you somehow got lucky and got them exactly what they needed, the equipment only solves today’s problems. When conditions change, you’re right back where you started, because you lack the core capacity and competencies needed to get the job done.

My time with California’s unemployment woes this summer drove home how little it would matter to “just get them off the mainframes.” The mainframes were doing quite well, actually, processing an order of magnitude more claims than the prior year and double what the Employment Development Department had ever done before. The problems were coming from the parts of the operation that didn’t scale, like identity verification and processing incoming mail. And the meta-problem was that no one on the team had the time, energy, or perspective to properly diagnose the bottleneck or explain it to those insisting on misguided solutions. The legislature was hammering the department to hire faster — adding thousands of new people each month — without anyone realizing that every new hire actually reduced overall productivity. Because new staff had to absorb an 800-page training manual covering all the Byzantine policies and processes, training took many years, so new staffers were busy learning while the experienced staff were busy teaching, leaving essentially no one to process claims. What they needed wasn’t a migration off the mainframe. They needed a compass and a rudder; as it was, they were weathering a hurricane with neither.

The problem is not the mainframe. The problem is that it takes 10 years to train a new claims processor.

We have seen what happens when we modernize for the sake of modernizing and over-rely on vendors without regard to internal capacity. Almost half the states had modernized their unemployment insurance systems prior to the pandemic.* It didn’t help. With very few exceptions, all the states, modernized or not, struggled mightily to deliver benefits accurately to claimants in a timely manner, to combat massive fraud, and to report accurately. What’s unclear (perhaps just to me, perhaps also to the leaders in charge) is what the goals of these modernization projects were. Given that the need for unemployment insurance is cyclical and these systems accrued backlogs in the last down cycle, the Great Recession, one can only assume that the ability to scale to meet the next one had been part of the goal. If so, oops. If not, perhaps that’s a decent starting place.

A starting place is what everyone trying to figure out how to spend this money needs. Bellotti has a good suggestion:

I prefer to restrict the scope by defining one measurable problem we are trying to solve. Building a modern infrastructure is not a goal. Different people naturally are going to disagree on which standards and best practices should be enforced and on how strongly they should be enforced. Few real-life systems completely conform to an ideal; there are always at least one or two places in systems where a nonstandard approach was used to make a specific function or integration work. Everyone knows these compromises exist and that they probably will continue to exist in some form or another in the new system, but it’s unlikely the organization will be able to agree on when and where to introduce them.

But if all the work is structured around one critical problem that you can measure and monitor, these conversations become much easier. You start by looking for as many opportunities as possible to make the problem better and prioritize them by the amount of estimated impact. When there is a disagreement on approach or technology, the criteria for the decision becomes: “Which one moves the needle further?”

This is the question leaders should be discussing right now. Not how to get unemployment insurance or Medicaid systems off the mainframes. What is the one measurable problem — out of many! — that we can start with, that will make a difference in delivering benefits to the American people? From there, we will need competent, thoughtful, empathetic teams that can partner deeply with the people who know the services and systems in question, and move forward together, not pointing fingers at one another. That doesn’t mean we won’t eventually move these services off mainframes, but that’s so far down the line it shouldn’t even be in sight yet.

The good news is that there are a lot of people in government with the experience, skills, empathy, and mindset to get this right. The teams at the United States Digital Service (where Bellotti earned some of her government stripes) and its sister group 18F (which sits in the General Services Administration) have spent the last seven years learning these lessons. They bring up-to-date tech and design know-how to the table, but they’ve also gotten very good at partnering deeply with the owners of these services to set meaningful goals, refactor policy and process alongside code, and help agencies and departments deliver services that work. Our new federal CIO, Clare Martorana, is a USDS alum who represents the best of this approach and has successfully employed these practices as CIO of the Office of Personnel Management. The amazing Robin Carnahan, formerly of 18F and U.S. Digital Response, will lead GSA once she’s confirmed. And there are many like-minded people in some of the agencies most critical to this agenda. The Biden administration has appointed people like Michele Evermore, a former advocate who’s seen how these systems go wrong on the front lines, to be a senior policy adviser at the Department of Labor. Rebecca Piazza and Lynn Overmann, alums of 18F and the White House Office of Science and Technology Policy, respectively, both now hold similar positions at the Department of Agriculture (which administers SNAP, our country’s most successful anti-poverty program). Other like-minded champions are either already working at or soon to join the Department of Health and Human Services. All of these people have an incredibly hard road ahead. But the all-star lineup that’s emerging gives me great hope.

The money coming at modernization is not too much. There is much that needs to be modernized. It is only a problem if the people in charge of spending it conflate bold with big. Bold would be different from what has come before. Bold would be setting clear goals and empowering cross-disciplinary teams to make decisions that drive toward those goals quickly, building momentum through wins that show actual value. Bold would be iterating on policy and technology together. Bold would be building long-term capacity.

Bellotti’s book could not have come at a better time, and while there are other factors in this equation, she outlines some of the most important. I just hope there is an audience for her message.

Jennifer Pahlka

Author of Recoding America: Why Government Is Failing in the Digital Age and How We Can Do Better, Fellow at the Federation of American Scientists