When Iris Bohnet talks about her work, the conversation almost always turns to a curtain, orchestras and equality. What do these three things have to do with one another? Quite simply: since some well-known American orchestras began having candidates audition behind a curtain, they have been hiring many more women. Studies have shown that this single piece of fabric gave women more equal opportunities. What looks a little like the talent show “The Voice” proves a point: it is not the people who have to change, for instance women dressing more masculine in pantsuits, but the rules of the game.
Iris Bohnet cites this example not only in her 2017 book “What Works”, but also on Tuesday in Berlin at the SZ's “Plan W” congress. “The world cannot get by on only half the talent pool,” says the behavioral economist, joining by video from a desk with a comfortable turquoise seating area in the background. Nor, she says, can you simply throw money at the problem and wait to see what happens.
You could say that Bohnet is used to talking: she quotes study after study, answers with confidence and routine, occasionally interspersing English technical terms. In between, she apologizes that her standard German has become a bit rusty. No wonder: the native Swiss has lived in the United States for many years. She started there as an assistant professor at Harvard University in 1998 and is now, among other roles, academic dean of the Harvard Kennedy School. Her focus is mainly behavioral economics. She has devoted half of her professional life to the question of how to bring more women and Black people into leadership positions; she wants to make the world of work more diverse. “Let us fight systemic prejudices with systemic solutions” is her call this Tuesday in Berlin.
Her “systemic solution,” as she calls it, is above all a technology that many would not necessarily associate with equality: artificial intelligence. The advantage of what is called AI is that the “bias”, that is, the prejudices that many people consciously or unconsciously carry in their heads, can simply be programmed out. She is not a blind advocate of the technology; there are also “bad algorithms” that make life even harder for some people. But there are certainly useful ones.
Right now, for example, she is working on a letter of recommendation that she will later cross-check using an algorithm. It can test whether certain groups are put at a disadvantage by prejudice, for example when a letter dwells on a woman's character traits rather than her skills. In job interviews, too, there are plenty of ways to keep one's own prejudices against certain people in check, says Bohnet, for example by asking all candidates the same questions in the same order. Only then do skills get the space they deserve. Where is a “curtain” still needed in the world of work? Clearly with photos, which are still an integral part of many application files in this country. “Sometimes we need curtains, but in other cases we also need role models,” explains the researcher.