If you want to know the consequences of tech’s gender gap, look no further than Britain after World War II, says Marie Hicks, assistant history professor at Illinois Institute of Technology.
Hicks is the author of “Programmed Inequality,” a book just published by MIT Press that explores why Britain’s computing industry, among the best in the world after World War II, had nearly vanished by the 1970s. Hicks argues that discriminatory hiring practices after the war against women, who were a critical part of the computing industry’s labor force during and after the war, crippled innovation in both the public and private sectors.
While this is an example from across the pond, Hicks notes this story can shed light on conversations around diversity in tech in the United States and around the world: What are the long-term consequences of having a tech industry that doesn’t draw talent from all available pools?
In this interview with Chicago Inno, Hicks talks about how Britain created an artificial labor shortage after the war, how computing became a male-dominated field, and the importance of studying the history of tech to better understand its future.
Chicago Inno: Let’s go back to 1944 when Britain “led the world in electronic computing.” Why was Britain so ahead of the game?
Marie Hicks: Britain was ahead of the game in part because they were desperate. They were being destroyed by German bombers, starved by German submarines, and they feared a ground invasion would soon follow. That is the context in which they created the world’s first digital electronic programmable computer for codebreaking. And not only did they create it, but they used it successfully. It allowed them to break enemy codes as fast as possible. By the war’s end, there were ten of these huge electronic Colossus computers decrypting intercepted German messages. The intelligence these computers—which were run by women—provided for the Allies ensured the success of the D-Day landings. It was thanks to the decoding done by Colossus I and II that the Allied forces knew where to land on D-Day.
Meanwhile, the United States had the luxury of taking its time with computing during WWII. The US was using early computers to try to improve weapons’ targeting accuracy by making math-intensive ballistics tables. Not only did these computers not provide information that measurably changed the course of the war, but the US didn’t even have an electronic computer during WWII. The Mark I, which was used during the war and made famous by pioneer programmer Grace Hopper, was a very slow, electromechanical machine. The most important thing it did, in the end, was provide a place where Hopper and others learned to program. The ENIAC, which was electronic, was not operational until after WWII ended.
However, over the next 30 years or so, the computer industry in Britain largely went away, and you argue that this had to do with “labor feminization and gendered technocracy.” Can you explain what you mean by this?
All technology is applied. Since technologies don’t exist in a vacuum, they rely on how people use them in order to succeed or fail. Simply put, computers need people. And early computers needed people all the more: They required an enormous amount of labor and expertise to program them and to apply their power to the various problems that industry and government were trying to solve. Early on, the people providing almost all of this technical labor were women, because (perhaps surprisingly to modern ears) technical work was seen as rote and deskilled. No one who could do supposedly more “intellectual” or white collar work wanted to work with machines in this way. So the labor that made computers run—and succeed—was feminized.
But as people started to understand how powerful and important this new technology was, there was a concerted effort to push women out of computing jobs and replace them with men, specifically men who were management-level or who were seen as “management material.” This meant that the government, which was the largest computer user for much of this period, created an artificial labor shortage. It wouldn’t hire women for these newly important jobs, but the men it wanted to hire simply didn’t have the skills or the interest in the work.
So the British cut the legs out from under all of their computerization projects by destroying the human side of their computing systems. In addition, since the government was such a large computer user it demanded that the British computer industry make certain kinds of computers that were geared towards solving this labor shortage, but that hurt the computer industry in the end. The government also trained many of Britain’s computer professionals, who then went out into industry. Dame Stephanie Shirley is a great example of a computer pioneer who got her start in the government and then went off to found a multi-billion dollar software company. But Shirley was an exception who succeeded despite how she was treated by her employers. The talents of many other women thrown away by the government were simply lost. This meant the government’s actions didn’t just hurt the public sector; they hurt the private sector as well.
The gender ratio in computer science also shifted in the United States around the same time. Why didn’t the US see the same slowdown in the computing industry?
The US had a far larger population and therefore a larger labor force, and this somewhat insulated it from the effects of its own discriminatory labor practices. Nonetheless, the US had a hard time competing with a country that had been absolutely ravaged by World War II: the USSR. The US had been left relatively untouched by the war in terms of its physical infrastructure and had lost far fewer lives to the fighting. But the USSR stayed in the game in part because it leveraged its labor force better. For instance, they used both men and women in their space program, and they beat us in the early stages of the Space Race, which shocked the US.
So maybe our labor force mismanagement in that period didn’t slow us down to the point of outright decline, but given all of our advantages, and the vast size of our workforce, shouldn’t we have done a heck of a lot better, relatively speaking? Instead, we had to fight tooth and nail to pull out a late win in the Space Race with the moon landing, after the Soviets had already launched the first satellite and put the first person into orbit.
Do you see the problems that Britain faced replicated in the US or elsewhere today?
Yes, very much so. The problems in the UK caused a powerful nation to decline precipitously because the UK discriminated against women workers in ways that were very far-reaching. Many thought women weren’t good enough to hold the jobs that they had already proven themselves capable of doing, and so people installed incompetent men in those jobs instead. Perhaps this sounds familiar. We have recently had an election in the US, for instance, that did just that same thing.
The US is in the midst of extreme turmoil right now, and our nation is caught in a similar downward spiral in terms of international prestige. We are seeing firsthand the terrible effect that discrimination can have on societies and economies. Along these lines, I think it’s important to point out…how structural discrimination works more broadly—it isn’t contained just within specific industries or sectors of the economy. This is why it’s so dangerous—to everybody, not just those whom it affects most directly.
There’s still a dearth of women in computer science in the US. What does that mean for US competitiveness in technology and innovation as we move forward?
Nothing good. On the national level it means we are hurting ourselves by not maximizing the talents of our labor force. Women, people of color, and many other groups are still arbitrarily left out (or pushed out) of technical jobs, and especially of the more powerful technical roles. The smaller number of positions of power held by women in Silicon Valley, and the far smaller amounts of venture capital they’re able to raise, mean women are able to do less and be less, and that hurts us all. It also means, at the other end of the spectrum, that more women in our society earn less than they should. It is not a coincidence that there are more women living in poverty.
The utopianism and meritocracy that Silicon Valley likes to believe in aren’t just false; they’re actually the exact opposite of the future we’re headed towards. More and more, Silicon Valley is focusing on fixing non-problems while leaving the really big, difficult problems unsolved, because those aren’t as lucrative in the short term. But it’s important to remember that civil rights, just like technology, require sustained investment. And civil rights, just like technology, are an absolute necessity for a prosperous future.
We’ve started to see a number of previously untold stories come out about the intersection of gender, race, and the history of technology, most notably with the recent book and film “Hidden Figures.” What does telling these stories do for our tech industry today?
They show that not only is there another way, but that things have been more inclusive, in certain senses, in the past. And, without all of these women—black and white—there would be far fewer successes in computing for us to celebrate, and far fewer successes for the small number of people at the top of these industries or government projects to take credit for.
Note: This interview was edited for length and clarity.