When designing or constructing a product, we specify the requirements (and design features) in copious detail. But one set of items we tend to leave out is the unwanted requirements (the "do not do's"). This comment probably sounds odd (demented?). However, we tend to focus on the positives when designing, constructing or commissioning equipment. If you think of the recent Toyota recall, with accelerator pedal operation that could not be controlled, perhaps if the conceptual designers had included a note itemising behaviours that cannot be permitted under any circumstances, things may have ended somewhat better. I would argue that in the earlier years of our profession (the 1800s), we were quickly made aware of negative design effects, as we were not then familiar with even the obvious weaknesses in a design. One only needs to think of steam boilers, where explosions occurred with horrible regularity. As a result, design improvements were introduced, leading to codes and standards for boiler specification and construction, and failures in this area dropped dramatically. However, the complexity of modern systems (especially at the computer-human interface) makes picking up these types of problems considerably more difficult.
The other associated challenge is that we don't appear to learn from our past mistakes, and thus don't recall them when putting a design specification together, again focussing on the positive attributes rather than the do not do's. Or we do know about past mistakes but believe we can circumvent or avoid the problems because of some successes in the past, and hence don't worry about them in the specification. Or we are not entirely familiar with the real reasons for past failures, so we ignore the issue. One theory holds that disasters in large systems recur with cyclical regularity. For example, one study revealed a 30-year gap between major bridge disasters in the USA; the suggestion was that this is perhaps due to a communication gap between one generation of engineers and the next. A provocative statement or not?
So whilst focussing on the can-do aspects of your next design or construction task, take a moment to ponder and itemise the must-not-do's under any circumstances. You won't be popular with your boss, but you may do the general public a great service and create a far safer and more functional design.
As a mild example of the above: if only the people who sold me the 30 variable speed drives had warned me in their installation specification that the capacitors can fail easily, and that any source of high-voltage spikes could blow the capacitors in all of them. In talking to others since then, I have come to realize that capacitor failure in variable speed drives has been a source of great pain for many others as well.
Thanks to the IEEE for an interesting set of articles on the topic.
As far as humans making mistakes go, there is a good comment often attributed to Albert Einstein:
Only two things are infinite, the universe and human stupidity, and I'm not sure about the former.
Yours in engineering learning