Dear Colleagues,

A short video (under five minutes) on Electrical Power Quality, presented by the inimitable Terry Cousins (of TLC software), is up for your perusal: https://www.youtube.com/watch?v=LZqkD4syAO8. More will be going up over the next few weeks (emphatically non-sales).

Following my blog last week, I am grateful to Dave Macdonald, our safety control systems expert (and author of three highly praised books on the topic), who has kindly put together the item below on systematic errors. It is worth reading if you are involved in designing control systems, whether for instrumentation, mechanical or electrical applications.

Some thoughts (by Dave) on Systematic Errors in Control Systems

If you build any sort of process or machine control system, there is a very good chance that, when you first test it or put it into service, you will be plagued by some kind of 'bug', either in the wiring or in the software. The proper term for such a bug is a 'systematic error in design'.

Most of us are familiar with the routine of version upgrades or specification revisions. These are not big problems for a control system on a production process: the control engineer says, "Sorry, just hold on a few minutes, I will make a few quick changes to remedy the problem. I didn't quite understand what you wanted the first time round."

But it's a different story if the bug lies dormant in a functional safety system, better known as a safety instrumented system (SIS) or emergency shutdown system. We can't afford a wrong response just when the plant is about to explode! So it's not surprising that SIS projects involve some heavy-duty quality assurance work to minimize the chances of systematic errors creeping into the design.

Some examples may help us to see the problem:

1 Faulty trip logic: An error in the trip logic diagram may not be revealed by testing the SIS, since the faulty response is built into the safety system itself. Walking through the diagram with the (independent) process engineer is a good idea here; a small sketch after this list shows one way to check coded logic against an agreed cause-and-effect matrix.
2 Failure to separate the safety sensor from the regular control sensor: whatever goes wrong with the control loop will also afflict the safety function. We call this a common cause failure, but it originates from a systematic design error.
3 Incorrect installation of the trip sensor: the sensor will not correctly read the process condition it is supposed to guard.
4 Failure to consider all possibilities when scoping a safety function: see the boxed story below for a notorious example from the London Underground.
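
On point 1, here is a minimal sketch of the idea in Python (the trip limits, test values and cause-and-effect table are all invented for illustration, not from any real project): the trip logic as coded is exercised against the cause-and-effect matrix agreed with the process engineer, so a misunderstanding surfaces at the desk rather than during a plant upset.

    # Hypothetical sketch: exercise coded trip logic against the
    # cause-and-effect matrix agreed with the process engineer.
    # All limits and test values are invented for illustration.

    def trip_required(pressure_bar, temperature_c):
        # The systematic error: the specification said EITHER condition
        # must trip, but the logic was coded with "and" by mistake.
        return pressure_bar > 12.0 and temperature_c > 180.0

    # Agreed cause-and-effect matrix: (pressure, temperature) -> must trip?
    cause_and_effect = [
        ((10.0, 150.0), False),  # normal operation: no trip
        ((12.5, 150.0), True),   # high pressure alone must trip
        ((10.0, 185.0), True),   # high temperature alone must trip
        ((12.5, 185.0), True),   # both high: must trip
    ]

    for (p, t), expected in cause_and_effect:
        actual = trip_required(p, t)
        flag = "OK      " if actual == expected else "MISMATCH"
        print(f"{flag} P={p} bar, T={t} degC: expected {expected}, got {actual}")

Any MISMATCH line is a systematic error caught on paper; testing the finished SIS alone would never reveal it, because the wrong response is exactly what the system was built to do.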

Systematic errors: mistakes that are built into the design through errors in understanding.

Train Safety…?!

A London Underground train rolled backwards half a mile when the driver fell asleep, highlighting a serious flaw in signaling systems: they only work with trains going forward.
The Engineer: 14 July 2000

5 Safety controls often employ redundancy to provide fault tolerance against random failures in the instruments. However, the benefit is significantly limited if the redundant instruments are identical: they may both suffer the same failure, for the same reason, at the same time.

Here is a systematic failure example from my own experience: two identical diaphragm seals, fitted to pressure transmitters on a distillation column, failed at the same time when a severe vacuum occurred during a shutdown. The diaphragms were stretched, leaving a 30% zero offset on both transmitters - not much help if one is for control and the other is for safety! The sketch below shows how this kind of common cause failure defeats redundancy.
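
To make the point concrete, here is a minimal sketch in Python (the trip point, pressures and offset are all invented for illustration, loosely modelled on the story above): two transmitters vote one-out-of-two on high pressure, and an identical zero offset shifts both readings the same way, so the pair misses the trip together, while a diverse pair does not.

    # Hypothetical sketch: an identical systematic error defeats 1oo2
    # redundancy. All numbers are invented for illustration.

    TRIP_POINT = 10.0  # bar: trip when measured pressure reaches this

    def reading(true_pressure, zero_offset):
        # Transmitter output = true pressure plus its systematic zero error.
        return true_pressure + zero_offset

    def vote_1oo2(a, b):
        # One-out-of-two voting: trip if either transmitter reads high.
        return a >= TRIP_POINT or b >= TRIP_POINT

    true_pressure = 10.5        # the plant really is past the trip point
    offset = -0.3 * TRIP_POINT  # a 30% zero offset, reading low

    # Identical instruments, identical damage: both read low together.
    identical = vote_1oo2(reading(true_pressure, offset),
                          reading(true_pressure, offset))

    # Diverse instruments: only one suffers the diaphragm failure.
    diverse = vote_1oo2(reading(true_pressure, offset),
                        reading(true_pressure, 0.0))

    print("Identical pair trips:", identical)  # False - both miss together
    print("Diverse pair trips:  ", diverse)    # True - the healthy one trips

Redundancy only buys protection against failures the two channels do not share; diversity is what breaks the common cause.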

How can you avoid systematic errors?
• Be on the alert for common cause failure possibilities between control and safety instruments. Always look for diversity between instruments on the same application.
• Use the best quality assurance methods in hardware specification and in application software projects. Verify each new step of the project against the outputs of the one before it.
• Apply the safety life cycle guidelines of IEC 61511 and then find someone who is genuinely independent of your project to review the project stages for pitfalls.
• Strictly manage all design modifications and evaluate them for impact on the original safety requirements.

The more familiar you are with design projects, the more aware you will be of the potential for built-in errors. Be alert!

Although the famous economist John Kenneth Galbraith probably wasn't thinking of systematic errors specifically, his remark certainly applies: "If all else fails, immortality can always be assured by spectacular error."

Thanks so much, Dave Macdonald, for your elegant dissertation above.

Yours in engineering learning,

Steve