Almost immediately after the gas alarms sounded, there were two explosions. Flames roared into the night sky as an inferno swallowed the rig. Dozens of dazed, injured people converged on the lifeboats. Some workers reacted to the chaos and panic by jumping 125 feet into the sea.
The Deepwater Horizon burned and sank, causing a massive Gulf of Mexico oil spill, the largest environmental disaster in U.S. history. Eleven workers died. More were injured.
Researchers identified multiple overlooked warning signs that, had they been heeded, could have helped BP and Transocean avert this tragedy. What can other management teams and companies learn so they can avoid tragedies of their own?
Bad news is good
According to Andrew Hopkins, author of "Disastrous Decisions," one of the most important reasons for the Deepwater disaster was "an attitude on the part of senior management that discourages the reporting of bad news." Leaders need to realize bad news is good. Organizations that want to avoid similar tragedies need to encourage employees to send problems upward so leaders can act before it's too late.
How can managers encourage employees to let leaders know of potential problems? In the safety arena, simple questions work well. Two of the best are "Do you see safety problems we aren't adequately handling?" and "When do you feel forced to take shortcuts in safety?"
"Group think" and "risky shift" edged the Deepwater toward disaster one step at a time -- and may equally infect many management teams.
"Group think" refers to the phenomenon in which those who doubt the majority's dominant view hesitate to voice concerns because they don't want to rock the boat. "Risky shift" explains why groups often make high-risk decisions that individual group members would avoid. Why? When "everyone" makes decisions, it diffuses responsibility to the extent that no individual takes accountability.
Multiple researchers also cite group think as a contributing cause of the Challenger space shuttle explosion. While the explosion's direct cause was faulty O-rings, researchers agree the Challenger launch decision displayed both group think and an organizational culture that discouraged upward communication from employees to leaders.
When four managers were tasked with deciding whether to make the launch despite engineer warnings about the risks created by a cold temperature launch, three managers voted to launch and one remained unsure.
When the group leader told the manager cautioning against the launch it was "time to take off his engineering hat and put on his management hat," he caved.
What happens when individuals hesitate to voice concerns or challenge problematic decisions, and when groups take risks individuals would wisely avoid? Disasters like the Deepwater Horizon and the Challenger.
If group think or risky shift infects your management team or department, ask: "How do we treat dissenting voices?" and "Do we each feel personally accountable for our group's decisions?"
We don't really monitor -- we just say we did
In both the Deepwater Horizon and the Challenger space shuttle launch disaster, managers and employees saw "without seeing" warning signs. For example, when teams noticed well testing wasn't proceeding as expected, they found a way to explain the problem away without questioning the well's integrity.
In the same way, managers and employees tested not to investigate whether the well was effectively sealed but to confirm that it was. Why? They'd never seen a well test fail, so they couldn't conceive that one would. They thus sought information that confirmed their pre-existing beliefs rather than evidence that might challenge them.
Further, managers monitoring the rig's activity acknowledged that management walk-throughs were more for visibility than for actual monitoring because they "didn't want to interfere" with the workers.
Meanwhile, the drillers stopped monitoring because "as far as they were concerned the job was over" once they finished their discrete task.
BP officials also acknowledged they erroneously believed senior Transocean employees had the ability to properly interpret tests. Transocean didn't share that view but noted its employees hadn't been trained to think in the abstract terms needed to effectively interpret the tests.
What lessons can we learn? We need to accept that bad news is good and we need to turn fresh eyes on our decision-making and monitoring processes.
Dr. Lynne Curry is a management/employee trainer and owner of the consulting firm The Growth Company Inc. Send your questions to her at lynne@thegrowthcompany.com