Renowned scientist Richard Feynman described cargo cults in his book “Surely You’re Joking, Mr. Feynman!”, using them to lampoon poor scientific practice:
In the South Seas there is a cargo cult of people. During the war they saw airplanes land with lots of good materials, and they want the same thing to happen now. So they've arranged to imitate things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in, with two wooden pieces on his head like headphones and bars of bamboo sticking out like antennas—he's the controller—and they wait for the airplanes to land. They're doing everything right. The form is perfect. It looks exactly the way it looked before. But it doesn't work. No airplanes land. So I call these things cargo cult science, because they follow all the apparent precepts and forms of scientific investigation, but they're missing something essential, because the planes don't land.
In wanting to be relevant and, more importantly, published, scientists were building on the published work of others without it having been verified by themselves or anyone else. Inherent bias in the original experimenters led to erroneous or downright fictional results. Feynman saw the danger of assuming the validity of published science all the way back in 1985, and was subsequently vindicated by the emergence of the replication crisis (in which only 36% of a large sample of published psychology studies could be replicated) and the ensuing distrust of experts.
The usefulness of the analogy does not end with science; the same intuitive trust in previous success stories leads to the same thinking in business. After the explosive success of Google stock in the early 21st century, the number of businesses attempting to emulate Google’s working practices was similarly explosive. The experience of the startup darling Zenefits is cargo-cultism at its finest: in a bid to be the next Google, it implemented an uber-relaxed working culture, resulting in heavy drinking during the day and illicit sex in the stairwells of its offices.
Why does this happen? For the same reason it happens in science: bias. In a stressful situation we readily recall the most prominent and newsworthy examples of successful businesses to compare ourselves to. This is the ‘availability heuristic’, the same mental shortcut that leads us to believe that air travel is inherently unsafe because we can remember a recent plane crash, or that Nigel Farage’s opinions are worth listening to because he is frequently on Question Time. Numerous other biases, such as overconfidence and optimism, are also culpable.
“So what?” you must be thinking: “I’m aware of these biases and therefore I will not fall into this trap!” Bad luck, because awareness is not enough; in many cases training actually increases the bad behaviour you wanted to avoid. An American government drug abuse report from 2004 found that a vigorous advertising and public communications programme on the dangers of drug use by youth was a complete flop: to the extent it affected kids’ behaviour at all, it made them more likely to smoke weed or to view doing so as favourable. See also: awareness campaigns on the negative effects of smoking, eating junk food, underage sex, bullying and list-making.
Don’t panic: despite the failings of the American government to stamp out drugs, diabetes and obesity with awareness campaigns, I think you are going to be just fine. There are a number of ways to mitigate your biases and shortcomings: