This article is also available in: Bosnian
By: Janez Rakušček, Executive Creative Director, Luna\TBWA Ljubljana
At the height of World War II, in 1943, the war effort involved not only millions of soldiers and war machines, but also entire teams of scientists and university research departments. Some of the fields they worked in were perfectly logical and understandable: chemistry, mechanical engineering, physics and the like, that is, the areas of research that produced more powerful, more efficient and deadlier weapons. Among the experts looking for ways to make their armies more effective was a remarkable group of statistical mathematicians from various universities, who faced an interesting problem, one that would be challenging even today. The problem was this: the US Air Force wanted to increase the percentage of airplanes that ‘survived’ missions over Germany, and was therefore considering where the planes needed stronger armor. But an airplane has an unfortunate characteristic: it cannot be completely covered with steel plates, because it then becomes too heavy, not agile enough, and consumes too much fuel. On the other hand, a fast and extremely agile aircraft carries so little protection that it becomes too vulnerable. The secret thus lies in putting the right amount of protection in the right places on the plane.
The US Air Force approached this search for a solution scientifically. They systematically counted the bullet holes on the planes that returned from missions. The surface of the plane was divided into four areas: engine, fuselage, fuel tank, and “the rest of the plane”. The statistics showed by far the most holes in the fuselage and the fewest in the engine. Military experts immediately decided to armor-plate the fuselage. It seems logical, but it isn’t. Not at all. Why the decision was so unreasonable was explained to the officers by the secret Statistical Research Group (SRG), whose star was Dr. Abraham Wald, a professor in the Department of Applied Mathematics at Columbia University (an Austrian Jew who had studied mathematics at the University of Vienna and emigrated to the United States fleeing the Nazis). He put a hypothetical question to the generals: perhaps German bullets have the strange property of hitting the fuselage more often than the engine, but the answer more likely lies elsewhere, for example in the fact that planes hit in the engine never made it back to base at all. The returning planes only showed where a plane could be hit and still survive. The answer is therefore contrary to intuitive logic: you put armor plating where there are no bullet holes.*
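Wald’s reasoning can be made concrete with a tiny simulation. All the numbers below are invented purely for illustration: we assume hits land uniformly across the four areas, but that a hit’s chance of downing the plane depends on where it lands. Counting holes only on the survivors then reproduces exactly the pattern the Air Force saw.

```python
import random

random.seed(42)

# Hypothetical numbers, chosen only to illustrate survivorship bias:
# every area is hit equally often, but an engine hit is far deadlier.
AREAS = ["engine", "fuselage", "fuel tank", "rest"]
P_DOWNED = {"engine": 0.8, "fuselage": 0.1, "fuel tank": 0.5, "rest": 0.2}

surviving_holes = {area: 0 for area in AREAS}
for _ in range(10_000):                  # 10,000 sorties, one hit each
    hit = random.choice(AREAS)           # hits are uniform in the air...
    if random.random() > P_DOWNED[hit]:  # ...but survival is not
        surviving_holes[hit] += 1        # we only ever count survivors

print(surviving_holes)
```

Even though every area was hit equally often, the fuselage dominates the hole count among returning planes and the engine barely registers, because the engine-hit planes are missing from the sample. Counting only what comes back measures survivability, not vulnerability.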
You’re probably wondering what old war stories have to do with modern communications. The answer is simple: surprisingly, a lot. We work in an environment that, on the one hand, requires a continual reexamination of boundaries, tapping into the unknown and educated guessing, and, on the other, demands ever more measurable efficiency. The tension between the desire for innovation and the requirement for reliability is bringing the problems of organizational culture into the creative industry. This is exactly what Amy Edmondson, Novartis Professor of Leadership and Management at Harvard Business School, deals with in her research. I had the honor of meeting and listening to her in New York. Edmondson is an expert in building successful, goal-oriented groups in creative environments and is, among other things, the author of Teaming: How Organizations Learn, Innovate, and Compete in the Knowledge Economy. In her practice-oriented research and teaching, Professor Edmondson identifies, among other things, two very important things that the leader of a creative team can enable.
These are creating a climate of psychological safety and promoting intelligent failure. In fact, three types of failure occur in companies and organizations. First are the failures we should be able to avoid, because our existing knowledge permits it (the problem of the bullet holes on military aircraft might fall into this category). Next are complex failures, in which a series of complicated internal and external factors creates conditions in which, despite the known circumstances, failures still occur. The third, most important and most useful type are the so-called “intelligent failures”, defined as “unwanted results of thought-out steps into new areas.” The first kind should of course be avoided at all costs, and the second should be predicted and limited, while intelligent failures deserve support and encouragement. As Thomas Edison, the inventor of the light bulb, put it: “I haven’t failed, I’ve just found 10,000 ways that won’t work.” Amy Edmondson’s advice is surprisingly similar to that of Slavoj Žižek (or the title of his book): “Fail better,” says Edmondson, while Žižek titled his book Try Again – Fail Better, which is, of course, a paraphrase of the famous line by the playwright Samuel Beckett.
The next time you’re thinking about a new campaign and the use of new, unknown and risky methods, start by applying Amy Edmondson’s formula. If your idea runs contrary to your experience (and some old wisdom), it’s worth taking time for further reflection. If the thing works in theory and you simply can’t predict all the consequences, then it’s worth a try. Even if you fail, you will fail well. And no one should hold it against you.
*The story is drawn from a very entertaining and useful book, How Not to Be Wrong: The Power of Mathematical Thinking, written by the brilliant mathematician Jordan Ellenberg and published by Penguin Books in 2014.