A classic example comes from the 1940s, when a series of seemingly inexplicable accidents involved B-17 bombers. Pilots were pressing the wrong switch: instead of flipping the switch that raised the flaps, they were flipping the one that raised the landing gear. Should they have been penalised or censured? The industry commissioned an investigator to probe deeper, and he found that the two switches were identical and sat side by side. Under the pressure of a difficult landing, pilots were simply pressing the wrong one. It was an error trap: an indication that human error often emerges from deeper systemic factors. The industry responded not by sacking the pilots but by attaching a small rubber wheel to the landing-gear switch and a small flap shape to the flaps control. The switches now had an intuitive, tactile meaning, easily identified under pressure. Accidents of this kind disappeared overnight.