provide explicit justifications for what they do when impelled to do so by others (Keane 2016, 78–79). And distinct social situations can direct and deflect blame as a result. Inside the microworld of arms manufacturing and military contract procurement, for instance, one is more likely to be called upon to explain one’s work performance in certain ways and not others. As mentioned, in public discussions of the permanent war economy, blame is normally assigned on the basis of perceived self-interest—politicians want reelection, military personnel want more power, and corporations want more money. But in military production, blame is determined internally through product testing and audit.17 Put differently, testing products is a reflexive practice with greater social implications, beyond customer ties alone. That is because, in addition to demonstrating technical reliability that workers may take pleasure in, testing is an ethical activity that offers preemptive defense against blameworthiness, an alibi that allows engineers to manufacture products with a belief that they did the best they could.

      Generally speaking, corporations in a neoliberal era use auditing practices to reduce various risks to the durability and reliability of their products by increasing the predictability of their processes, including employee performance (see Shore and Wright 2000, 85). What this means for Lockheed employees is that at various stages in the design of a product they must reflect on product success in specific ways even before they pitch a project to the military. The more important the product, the more that is invested in predicting and preventing (or rather delaying) the product’s failure and therefore mitigating any person’s sense of accountability. More scrutiny with safety-critical products means more testing of software itself, but not necessarily more environmental testing. Simon explained this to me in detail:

      There’s a formal thing called a risk analysis, and you do a twenty-year risk analysis. This is a brand-new aircraft, brand new radios, brand new mechanical, computer brain that’s gonna be self-aware. What are going to be the technical challenges? All these things either won’t be built, can’t be built. . . what’s the plan? So you do a formal risk analysis and say, “At this point in time if we don’t have this design mature enough, we have to go to plan B or plan C.” And this formal risk analysis is approved.

      Here, formal risk analyses translate “waste” into a series of tests and procedures that arguably deflect responsibility by distributing the labor involved in creating instruments of violence. Testing and auditing also mean inviting other people into the product assessment process:

      Another thing Lockheed Martin does (and I assume other companies do) is, you present your design to an internal panel of what’s called “Wise Old Owls.” These are old scientists and engineers that have been through all of it before, and you first do a red team and a black team and they can shut down the Lockheed Martin effort and fire all us engineers if they don’t think we’re gonna win.

      These internal “gatekeepers” monitor and assess projects and can have them thrown out for technical reasons, but Simon did not seem concerned about the human or financial cost, likening it to a form of blind peer review (perhaps because I was his interlocutor, and he was searching for an analogous domain of socially mediated audit).

      If internal competition and risk analysis can lay waste to projects, in anticipation of their presentation to the client, this is actually meant to save time and money in the long run. If a project is committed to for twenty years, then even more capital, human and financial, is at stake. But too much process analysis can also be wasteful, from the perspective of Lockheed workers. The new face of manufacturing is not about being productive, Bork told me, but predictive. He said a lot of time and resources are now invested in cultivating reflexive attention to processes. What does this look like in practice? As Bork put it:

      So that you have a set of rules that will always give you an expected result coming out. “I have this much time to develop this, and based on my processes, I know I need this many people, this amount of time and this amount of money, and in the end I’ll have what I’m asking for.” So it gives you predictability, that’s the big thing.

      If a process is supposed to take you an hour but actually takes much longer, you want not only to shorten the amount of time it takes, but also to understand why you had it wrong the first time. This means asking not only what happened, but also how one can improve in the future. This is not bad, necessarily, but since new approaches are rarely implemented uniformly or perfectly, it ends up taking even longer to do one’s work. It is easy to get cynical about predictive management for this reason:

      You’ve got the new darling method of development that’s making the rounds right now: Agile. This is the latest round of Kool-Aid. Every six or seven years, there’s some new paradigm of how to develop software that makes the rounds, and it’s gonna be the savior of everything and everybody, and it’s gonna make us so much more productive. The DoD has actually asked for more programs to be Agile. And they’re the main client, so everyone says “Yes, we’re Agile!”

      Agile is a project management method that attempts to divide up tasks into small, two- to four-week bursts of activity, or “chunks.” Bork is of the opinion that it doesn’t actually accomplish anything and is almost a complete waste of time and effort. Agile represents additional auditing that the customer asks for, but according to those I’ve spoken with, Lockheed is just as capable of chasing the latest pointless craze as the DoD. Even this can come down to customer ties. Sometimes, as Bork joked, the method they end up using might just come down to some salesman who had “a really really good golf game!” Here, Bork can be seen as implicitly blaming the capriciousness of sales and marketing personnel for wasting his time, people who, he suggests, deliberately engage in leisure activities and pretend it is work.

      Adding more time to any process typically makes it more expensive, sometimes for the company and not only the customer. If you bid a “firm fixed price” on a contract, for instance, then any additional cost from extra testing is not covered, which means managers get nervous and executives get upset. This can be remedied if the process and product are of high enough quality that they draw the customer back again. But since this is no guarantee, it can create pressure to hurry up and finish things within schedule deadlines and cost limitations.

      In some instances, however, wasted money and time are equated with doing things right and achieving greater public goods—durable products, predictable processes—even if they come at the sacrifice of private profit. If additional testing and risk analysis can waste money and time, some of my informants also associated it with higher moral ends, especially saving lives and the national defense. I should note that none of my informants offered this naturally in our conversations, but only after I asked more probing questions, arguably shifting their familiar way of assessing accountability and blame.18 But they did have responses. As Bork put it:

      You have lives of your countrymen that are reliant on what you’re doing. Just as if you were writing software or doing something with a jet airliner, you’ve got lives on the line of the people that are up in the air. So, depending on what you’re writing the software for, there can be a high degree of scrutiny on what you do. And the levels of testing get extremely expensive to guarantee as much as possible the reliability of that product.

      Bork said that the fact that lives depended on safety-critical products was not explicitly discussed often, but generally understood by all those at Lockheed. He added that this might be easy to forget because of a division of labor where they might only be testing a particular widget that goes in another machine. This not only means that more people in more states and more congressional districts have jobs (thereby making the military industrial complex possible); it also means that the product that will be consumed—and its potential destructiveness—is removed from view. The symbolic distance between war front and home front is bolstered through a literal spatio-temporal difference, one that serves to alienate military manufacturers from the consequences of their actions. Instead, they are consumed with reflexive attention to their work performance.

      Bork said that engineers generally knew if some feature was safety-critical (e.g., for engine control), because it changed quality-testing standards during product development:

      If you have something that’s safety-critical, that word itself implies that people are going to die. So if you’re
