ISO Withdraws Machinery Risk Assessment Standards

ISO has withdrawn three long-standing basic machinery safety standards used internationally and in the EU and replaced them with a single combined document. If you design, build or integrate machinery for sale internationally or within the EU, this new standard needs to be on your BUY list!

ISO 14121-1 Withdrawn, along with ISO 12100-1 and -2

As of 20-Oct-2010, three standards, ISO 14121-1, Safety of Machinery – Risk Assessment – Part 1: Principles, ISO 12100-1, Safety of machinery – Basic concepts, general principles for design – Part 1: Basic terminology and methodology, and ISO 12100-2, Safety of machinery – Basic concepts, general principles for design – Part 2: Technical principles, have been replaced by the new ISO 12100:2010, Safety of machinery – General principles for design – Risk assessment and risk reduction, which blends these three fundamental Type A machinery standards into one coherent whole. This important new document means that machinery designers have the fundamental design requirements for all machinery in one standard. The only exception is now ISO/TR 14121-2:2007, Safety of machinery — Risk assessment — Part 2: Practical guidance and examples of methods. This Technical Report stands as guidance for risk assessment and provides a number of examples of the different methods used to assess machinery risk.

Abstract

This abstract is taken from the ISO web catalogue page for the new standard.

ISO 12100:2010 specifies basic terminology, principles and a methodology for achieving safety in the design of machinery. It specifies principles of risk assessment and risk reduction to help designers in achieving this objective. These principles are based on knowledge and experience of the design, use, incidents, accidents and risks associated with machinery. Procedures are described for identifying hazards and estimating and evaluating risks during relevant phases of the machine life cycle, and for the elimination of hazards or sufficient risk reduction. Guidance is given on the documentation and verification of the risk assessment and risk reduction process.

ISO 12100:2010 is also intended to be used as a basis for the preparation of type-B or type-C safety standards.

It does not deal with risk and/or damage to domestic animals, property or the environment.

Table of Contents

Here is the table of contents from the standard as published.

Foreword
Introduction
1 Scope
2 Normative references
3 Terms and definitions
4 Strategy for risk assessment and risk reduction
5 Risk assessment
5.1 General
5.2 Information for risk assessment
5.3 Determination of limits of machinery
5.3.1 General
5.3.2 Use limits
5.3.3 Space limits
5.3.4 Time limits
5.3.5 Other limits
5.4 Hazard identification
5.5 Risk estimation
5.5.1 General
5.5.2 Elements of risk
5.5.3 Aspects to be considered during risk estimation
5.6 Risk evaluation
5.6.1 General
5.6.2 Adequate risk reduction
5.6.3 Comparison of risks
6 Risk reduction
6.1 General
6.2 Inherently safe design measures
6.2.1 General
6.2.2 Consideration of geometrical factors and physical aspects
6.2.3 Taking into account general technical knowledge of machine design
6.2.4 Choice of appropriate technology
6.2.5 Applying principle of positive mechanical action
6.2.6 Provisions for stability
6.2.7 Provisions for maintainability
6.2.8 Observing ergonomic principles
6.2.9 Electrical hazards
6.2.10 Pneumatic and hydraulic hazards
6.2.11 Applying inherently safe design measures to control systems
6.2.12 Minimizing probability of failure of safety functions
6.2.13 Limiting exposure to hazards through reliability of equipment
6.2.14 Limiting exposure to hazards through mechanization or automation of loading (feeding) / unloading (removal) operations
6.2.15 Limiting exposure to hazards through location of setting and maintenance points outside danger zones
6.3 Safeguarding and complementary protective measures
6.3.1 General
6.3.2 Selection and implementation of guards and protective devices
6.3.3 Requirements for design of guards and protective devices
6.3.4 Safeguarding to reduce emissions
6.3.5 Complementary protective measures
6.4 Information for use
6.4.1 General requirements
6.4.2 Location and nature of information for use
6.4.3 Signals and warning devices
6.4.4 Markings, signs (pictograms) and written warnings
6.4.5 Accompanying documents (in particular – instruction handbook)
7 Documentation of risk assessment and risk reduction
Annex A (informative) Schematic representation of a machine
Annex B (informative) Examples of hazards, hazardous situations and hazardous events
Annex C (informative) Trilingual lookup and index of specific terms and expressions used in ISO 12100
Bibliography

Buying Advice

This is a significant change to these three standards, at least in the sense that the material has been reorganized into a single, coherent document. If you are basing a CE Mark on these standards, you should strongly consider purchasing the harmonized version when it becomes available at your favourite retailer. The ISO version is available now in English and French as a hard copy or PDF document, priced at 180 CHF (Swiss Francs), or about CA$175.

As of this writing, CEN has adopted EN ISO 12100:2010, with a published “dow” (date of withdrawal) of 30-Nov-2013. The “doc” (date of cessation) will be published in a future list of harmonized standards in the Official Journal of the European Union under the Machinery Directive 2006/42/EC.

My recommendation is to BUY this standard if you are a machine builder. If you are CE marking your product you may want to wait until the harmonized edition is published; however, it is worth knowing that technical changes to the normative content of the standard are very unlikely when harmonization occurs.

How Risk Assessment Fails

[Photo: Fukushima Dai Ichi Power Plant after the explosions]

The events unfolding at Japan’s Fukushima Dai Ichi Nuclear Power plant are a case study in ways that the risk assessment process can fail or be abused. In an article published on Bloomberg.com, Jason Clenfield itemizes decades of fraud and failures in engineering and administration that have led to the catastrophic failure of four of six reactors at the 40-year-old Fukushima plant. Clenfield’s article, ‘Disaster Caps Faked Reports’, goes on to cover similar failures in the Japanese nuclear sector.

Most people believe that the more serious the public danger, the more carefully the risks are considered in the design and execution of projects like the Fukushima plant. Clenfield’s article points to failures by a number of major international businesses involved in the design and manufacture of components for these reactors that may have contributed to the catastrophe playing out in Japan. In some cases, the correct actions could have bankrupted the companies involved, so rather than risk financial failure, these failures were covered up and the workers involved rewarded for their efforts. As you will see, sometimes the degree of care that we have a right to expect is not the level of care that is used.

How does this relate to the failure and abuse of the risk assessment process? Read on!

Risk Assessment Failures

[Photo: Earthquake and tsunami damage, Fukushima Dai Ichi Power Plant]

The Fukushima Dai Ichi nuclear plant was constructed in the late 1960s and early 1970s, with Reactor #1 going on-line in 1971. The reactors at this facility use ‘active cooling’, requiring electrically powered cooling pumps to run continuously to keep the core temperatures in the normal operating range. As you will have seen in recent news reports, the plant is located on the shore, drawing water directly from the Pacific Ocean.

Learn more about Boiling Water Reactors used at Fukushima.

Read IEEE Spectrum’s “24 Hours at Fukushima”, a blow-by-blow account of the first 24 hours of the disaster.

Japan is located along one of the most active fault lines in the world, with plate subduction rates exceeding 90 mm/year. Earthquakes are so commonplace in this area that the Japanese people consider Japan to be the ‘land of earthquakes’, starting earthquake safety training in kindergarten.

Japan is the country that gave us the word ‘tsunami’, because the effects of sub-sea earthquakes often include large waves that swamp the shoreline. These waves affect all countries bordering the world’s oceans, but are especially prevalent where strong earthquakes are frequent.

In this environment it would be reasonable to expect that earthquake and tsunami effects would merit the highest consideration when assessing the risks related to these hazards. Remembering that risk is a function of severity of consequence and probability, the risk assessed for earthquake and tsunami should have been critical: loss of cooling can result in the catastrophic overheating of the reactor core, potentially leading to a core meltdown.

The Fukushima Dai Ichi plant was designed to withstand 5.7 m tsunami waves, even though a 6.4 m wave had hit the shore close by 10 years before the plant went on-line. The wave generated by the recent earthquake was 7 m. Although the plant was not washed away by the tsunami, the wave created another problem.

Now consider that the reactors require constant forced cooling using electrically powered pumps. The backup generators, installed to ensure that the cooling pumps remain operational even if mains power to the plant is lost, were installed in a basement subject to flooding. When the tsunami hit the seawall and spilled over the top, the floodwaters poured into the backup generator room, knocking out the diesel backup generators. The cooling system stopped. With no power to run the pumps, the reactor cores began to overheat. Although the reactors survived the earthquake and the tsunami, without power to run the pumps the plant was in trouble.

Learn more about the accident.

Clearly there was a failure of reason when assessing the risks related to the loss of cooling capability in these reactors. For systems that are as mission critical as these, multiple levels of redundancy beyond a single backup system are often the minimum required.

In another plant in Japan, a section of piping carrying superheated steam from the reactor to the turbines ruptured, injuring a number of workers. The pipe was installed when the plant was new and had never been inspected since installation because it was left off the safety inspection checklist. This is an example of a failure that resulted from blindly following a checklist without looking at the larger picture. There can be no doubt that someone at the plant noticed that other pipe sections were inspected regularly while this particular section was skipped, yet no changes in the process resulted.

Here again, the risk was not recognized even though it was clearly understood with respect to other sections of pipe in the same plant.

In another situation at a nuclear plant in Japan, drains inside the containment area of a reactor were not plugged at the end of the installation process. As a result, a small spill of radioactive water was released into the sea instead of being properly contained and cleaned up. The risk was well understood, but the control procedure for this risk was not implemented.

Finally, the Kashiwazaki-Kariwa plant was constructed along a major fault line. The designers used figures for the maximum seismic acceleration that were three times lower than the accelerations that could be created by the fault. Regulators permitted the plant to be built even though the relative weakness of the design was known.

Failure Modes

I believe that there are a number of reasons why these kinds of failures occur.

People have a difficult time appreciating the meaning of probability. Probability is a key factor in determining the degree of risk from any hazard, yet when figures like ‘1 in 1000’ or ‘1 × 10⁻⁵ occurrences per year’ are discussed, it’s hard for people to truly grasp what these numbers mean. Likewise, when more subjective scales are used it can be difficult to really understand what ‘likely’ or ‘rarely’ actually mean.

Consequently, even in cases where the severity may be very high, the risk related to a particular hazard may be neglected because the probability seems low, and therefore the risk is deemed to be low.

When probability is discussed in terms of time, a figure like ‘1 × 10⁻⁵ occurrences per year’ can make the chance of an occurrence seem distant, and therefore less of a concern.

Most risk assessment approaches deal with hazards singly. This is done to simplify the assessment process, but the problem with this approach is the effect that multiple failures or cascading failures can create. In a multiple failure condition, several protective measures fail simultaneously from a single cause (sometimes called Common Cause Failure). In this case, back-up measures may fail from the same cause, resulting in no protection from the hazard.

In a cascading failure, an initial failure is followed by a series of failures that lead to the partial or complete loss of the protective measures, and hence partial or complete exposure to the hazard. Reasonably foreseeable combinations of failure modes in mission critical systems must be considered and the probability of each estimated.
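
To see why a common cause can defeat redundancy, here is a rough numerical sketch. The failure probabilities and the simple beta-factor split between independent and common-cause failures are illustrative assumptions, not figures from ISO 12100 or from any real plant:

```python
# Rough sketch: how common cause failure (CCF) undermines redundancy.
# All numbers are illustrative assumptions, not data from any standard or plant.

p_fail = 1e-2   # assumed probability that one backup system fails on demand
n = 3           # number of nominally independent backup systems
beta = 0.05     # assumed fraction of failures that stem from a shared (common) cause

# If the backups really were independent, all three would have to fail together:
p_independent = p_fail ** n

# With a common cause (for example, every generator in the same flooded basement),
# one event can disable all of the backups at once:
p_with_ccf = (1 - beta) * p_fail ** n + beta * p_fail

print(f"All {n} backups fail, independence assumed:  {p_independent:.1e}")
print(f"All {n} backups fail, beta-factor CCF model: {p_with_ccf:.1e}")
# The common cause term dominates, so redundancy alone buys far less than it appears to.
```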

Combinations of hazards can also be synergistic, resulting in a higher level of severity than is presented by any one of the hazards taken singly. Reasonably foreseeable combinations of hazards and their potential synergies must be identified and the risk estimated.

Oversimplification of the hazard identification and analysis processes can result in overlooking hazards or underestimating the risk.

Thinking about the Fukushima Dai Ichi plant again, the combination of the effects of the earthquake on the plant and the added impact of the tsunami wave resulted in the loss of primary power, followed by the loss of backup power from the backup generators, and the subsequent partial meltdowns and explosions at the plant. This combination of earthquake and tsunami was well known, not some ‘unimaginable’ or ‘unforeseeable’ situation. When conducting risk assessments, all reasonably foreseeable combinations of hazards must be considered.

Abuse and Neglect

The risk assessment process is subject to abuse and neglect. Risk assessment has been used by some as a means to justify exposing workers and the public to risks that should not have been permitted. Skewing the results of the risk assessment, either by underestimating the risk initially or by overestimating the effectiveness and reliability of control measures, can lead to this situation. Decisions relating to the ‘tolerability’ or the ‘acceptability’ of risks when the severity of the potential consequences is high should be approached with great caution. In my opinion, unless you are personally willing to take the risk you are proposing to accept, it cannot be considered either tolerable or acceptable, regardless of the legal limits that may exist.

In the case of the Japanese nuclear plants, the operators have publicly admitted to falsifying inspection and repair records, some of which have resulted in accidents and fatalities.

In 1990, the US Nuclear Regulatory Commission wrote a report on the Fukushima Dai Ichi plant that predicted the exact scenario that resulted in the current crisis. These findings were shared with the Japanese authorities and the operators, but no one in a position of authority took the findings seriously enough to do anything. Relatively simple and low-cost protective measures, like increasing the height of the protective sea wall along the coastline and moving the backup generators to high ground, could have prevented a national catastrophe and the complete loss of the plant.

A Useful Tool

Despite these human failings, I believe that risk assessment is an important tool. Increasingly sophisticated technology has rendered ‘common sense’ useless in many cases, because people do not have the expertise to have any common sense about the hazards related to these technologies.

Where hazards are well understood, they should be controlled with the simplest, most direct and effective measures available. In many cases this can be done by the people who first identify the hazard.

Where hazards are not well understood, bringing in experts with the knowledge to assess the risk and implement appropriate protective measures is the right approach.

The common aspect in all of this is the identification of hazards and the application of some sort of control measures. Risk assessment should not be neglected simply because it is sometimes difficult, because it can be done poorly, or because the results can be neglected or ignored. We need to improve what we do with the results of these efforts, rather than neglect to do them at all.

In the meantime, the Japanese, and the world, have some cleanup to do.

The Problem with Probability

Risk Factors

Severity

There are two key factors that need to be understood when assessing risk: Severity and Probability (or Likelihood). Sometimes the term ‘consequence’ is used instead of ‘severity’, and in the case of machinery risk assessment they can be considered synonyms. Severity seems to be fairly well understood; most people can fairly easily imagine what reaching into a spinning blade might do to the hand doing the reaching. A problem arises when there is an insufficient understanding of the hazard, but that’s the subject for another post.

Probability

Probability or likelihood is used to describe the chance that an injury or a hazardous situation will occur. Probability is used when numeric data are available and the probability can be calculated, while likelihood is used when the assessment is subjective. The probability factor is often broken down further into three sub-factors, as shown in Figure 3 of ISO 12100 [1]: exposure of persons to the hazard, the occurrence of a hazardous event, and the possibility of avoiding or limiting the harm.
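
As a rough illustration of how severity and these probability sub-factors can be combined into a single estimate, here is a minimal scoring sketch. The 1-to-4 scales and the scoring scheme are hypothetical, chosen only for the example; they are not the method defined in ISO 12100 or ISO/TR 14121-2:

```python
# Hypothetical risk scoring sketch: severity plus the three probability sub-factors.
# The 1-4 scales and the combination rule are illustrative only; they are not the
# scoring method of ISO 12100 or ISO/TR 14121-2.
from dataclasses import dataclass

@dataclass
class RiskEstimate:
    severity: int     # 1 = minor injury ... 4 = fatality
    exposure: int     # 1 = rare exposure ... 4 = continuous exposure
    occurrence: int   # 1 = unlikely hazardous event ... 4 = very likely
    avoidance: int    # 1 = easily avoided ... 4 = scarcely possible to avoid

    def score(self) -> int:
        # Probability is built from its three sub-factors, then combined with severity.
        probability = self.exposure + self.occurrence + self.avoidance
        return self.severity * probability

# Example: frequent exposure to a fast, hard-to-avoid hazardous event that could kill.
estimate = RiskEstimate(severity=4, exposure=3, occurrence=2, avoidance=4)
print(estimate.score())  # higher score = higher priority for risk reduction
```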

There is No Reality, only Perception…

Whether you use probability or likelihood in your assessment, there is a fundamental problem with people’s perception of these factors. People have a difficult time appreciating the meaning of probability. Probability is a key factor in determining the degree of risk from any hazard, yet when figures like “1 in 1000” or “1 × 10⁻⁵ occurrences per year” are discussed, it’s hard for people to truly grasp what these numbers mean. When probability is discussed as a rate, a figure like “1 × 10⁻⁵ occurrences per year” can make the chance of an occurrence seem inconceivably distant, and therefore less of a concern. Likewise, when more subjective scales are used it can be difficult to really understand what “likely” or “rarely” actually mean. Consequently, even in cases where the severity may be very high, the risk related to a particular hazard may be neglected if the probability is deemed low.
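
One way to make a rate like that more tangible is to convert it into the chance of at least one occurrence over a time span people can picture, such as a 40-year plant life. A minimal sketch, assuming a constant rate (a Poisson model), with the rate itself chosen only for illustration:

```python
import math

# Convert an annual occurrence rate into the chance of at least one occurrence
# over a longer, more tangible period. Assumes a constant rate (Poisson model).

rate_per_year = 1e-5          # e.g. "1 x 10^-5 occurrences per year"

for years in (1, 40, 1000):   # one year, one plant life, 25 plants over 40 years
    p_at_least_one = 1 - math.exp(-rate_per_year * years)
    print(f"{years:5d} years: P(at least one occurrence) = {p_at_least_one:.2e}")

# Roughly a 0.04 % chance over a 40-year plant life: small, but not 'inconceivable',
# especially when multiplied across many installations.
```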

To see the other side, consider people’s attitude when it comes to winning a lottery. Most people will agree that “Someone will win”, so the infinitesimal probability of winning is seen as significant. The same odds given in relation to a negative risk might be dismissed as ‘infinitesimally small’, and therefore negligible.

For example, consider the decisions made by the Tokyo Electric Power Corporation (TEPCO) when they constructed the Fukushima Dai Ichi nuclear power plant. TEPCO engineers and scientists assessed the site in the 1960s and decided that a 10 meter tsunami was a realistic possibility at the site. They decided to build the reactors, turbines and backup generators 10 meters above the surrounding sea level, then located the system-critical condensers in the seaward yard of the plant at a level below 10 meters. To protect that critical equipment they built a 5.7 meter high seawall, almost 50% shorter than the predicted height for a tsunami! While I don’t know what rationale they used to support this design decision, it is clear that the plant would have taken significant damage from even a relatively mild tsunami. The 11-Mar-2011 tsunami topped the highest prediction by nearly 5 meters, resulting in a Level 7 nuclear accident and decades of recovery. TEPCO executives have repeatedly stated that the conditions leading to the accident were “inconceivable”, and yet redundancy was built into the systems for just this type of event, and some planning for tsunami effects was put into the design. Clearly the event was neither unimaginable nor inconceivable, just underestimated.

Risk Perception

So why is it that tiny odds are seen as an acceptable risk and even a reasonable likelihood in one case, and a negligible chance in the other, particularly when the ignored case is the one that will have a significant negative outcome?

According to an article in Wikipedia [2], there are three main schools of thought when it comes to understanding risk perception: psychological, sociological and interdisciplinary. In a key early paper written in 1969 by Chauncey Starr [3], it was discovered that people would accept voluntary risks 1000 times greater than involuntary risks. Later research has challenged these findings, showing the gap between voluntary and involuntary risk acceptance to be much narrower than Starr found.

Early psychometric research by Kahneman and Tversky showed that people use a number of heuristics to evaluate information. These heuristics included:
  • Representativeness;
  • Availability;
  • Anchoring and Adjustment;
  • Asymmetry; and
  • Threshold effects.
This research showed that people tend to be risk-averse when it comes to gains, like the potential loss of savings from making risky investments, while they tend to accept risk more easily when it comes to potential losses, preferring the hope of losing nothing over a certain but smaller loss. This may explain why low-probability, high-severity OHS risks are so often ignored, in the hope that lesser injuries will occur rather than the maximum predicted severity.
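
To put a number on that asymmetry, consider a certain loss against a gamble with the same expected loss; the dollar amounts below are arbitrary and only illustrate the framing effect described above:

```python
# Illustrative expected-value comparison for the loss-framing asymmetry described above.
# The dollar amounts are arbitrary; the point is that the options are equivalent on average.

certain_loss = 50                     # option A: lose $50 for sure
gamble = {0: 0.5, 100: 0.5}           # option B: 50 % lose nothing, 50 % lose $100

expected_gamble_loss = sum(loss * p for loss, p in gamble.items())

print(f"Option A, certain loss:  ${certain_loss}")
print(f"Option B, expected loss: ${expected_gamble_loss:.0f}")
# Both options lose $50 on average, yet many people prefer the gamble, which mirrors
# how low-probability, high-severity OHS risks get accepted in the hope that the
# worst case will not happen.
```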

Significant results also show that better information frequently has no effect on how risks are judged. More weight is put on risks with immediate, personal results than on those seen in longer time frames. Psychometric research has shown that risk perception is highly dependent on intuition, experiential thinking, and emotions. The research identified characteristics that may be condensed into three higher-order factors:

  1. the degree to which a risk is understood;
  2. the degree to which it evokes a feeling of dread; and
  3. the number of people exposed to the risk.

“Dread” describes a risk that elicits visceral feelings of impending catastrophe, terror and loss of control. The more a person dreads an activity, the higher its perceived risk and the more that person wants the risk reduced [4]. Fear is clearly a stronger motivator than any degree of information.

Considering the differing views of those studying risk perception, it’s no wonder that this is a challenging subject for safety practitioners!

Estimating Probability

Frequency and Duration

Some aspects of probability are not too difficult to estimate. Consider the Frequency or Duration of Exposure factor. At face value this can be stated as “X cycles per hour” or “Y hours per week”. Depending on the hazard, there may be more complex exposure data, like that used when considering audible noise exposure. In that case, noise is often expressed as a time-weighted average (TWA), like “83 dB(A), 8 h TWA”, meaning 83 dB(A) averaged over 8 hours.
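
Because decibels are logarithmic, exposures at different levels are combined on an energy basis rather than by simple averaging. Here is a minimal sketch of an 8-hour TWA calculation, assuming the equal-energy (3 dB exchange rate) rule used in ISO 1999; the levels and durations are made up for illustration:

```python
import math

# 8-hour time-weighted average (TWA) noise exposure, using the equal-energy
# (3 dB exchange rate) rule of ISO 1999. Levels in dB(A), durations in hours.
exposures = [(95.0, 1.0),   # 1 h at 95 dB(A), e.g. near a punch press
             (85.0, 4.0),   # 4 h at 85 dB(A)
             (70.0, 3.0)]   # 3 h at 70 dB(A), e.g. office work

reference_hours = 8.0
energy = sum(t * 10 ** (level / 10) for level, t in exposures)
twa = 10 * math.log10(energy / reference_hours)

print(f"8 h TWA = {twa:.1f} dB(A)")   # about 87.5 dB(A) for the figures above
```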

Estimating the probability of a hazardous situation is usually not too tough either. This could be expressed as “15 minutes, once per day/shift” or “2 days, twice per year”.

Avoidance

Estimating the probability of avoiding an injury in any given hazardous situation is MUCH more difficult, since the speed of occurrence, the ability to perceive the hazard, the knowledge of the exposed person, their ability to react in the situation, the level of training that they have, the presence of complementary protective measures, and many other factors come into play. Depth of understanding of the hazard and the details of the hazardous situation by the risk assessors is critical to a sound assessment of the risk involved.

The Challenge

The challenge for safety practitioners is twofold:

  1. As practitioners, we must try to overcome our biases when conducting risk assessment work, and where we cannot overcome those biases, we must at least acknowledge them and the effects they may produce in our work; and
  2. We must try to present the risks in terms that the exposed people can understand, so that they can make a reasoned choice for their own personal safety.

I don’t suggest that this is easy, nor do I advocate “dumbing down” the information! I do believe that risk information can be presented to non-technical people in ways that let them understand the critical points.

Risk assessment techniques are becoming fundamental in all areas of design. As safety practitioners, we must be ready to conduct risk assessments using sound techniques, be aware of our biases, and be patient in communicating the results of our analysis to everyone that may be affected.

References

[1] “Safety of Machinery — General Principles for Design — Risk Assessment and Risk Reduction”, ISO 12100, Figure 3, ISO, Geneva, 2010.
[2] “Risk Perception”, Wikipedia, accessed 19/20-May-2011, http://en.wikipedia.org/wiki/Risk_perception.
[3] Chauncey Starr, “Social Benefits versus Technological Risks”, Science, Vol. 165, No. 3899 (19 Sep. 1969), pp. 1232–1238.
[4] Paul Slovic, Baruch Fischhoff and Sarah Lichtenstein, “Why Study Risk Perception?”, Risk Analysis, 2(2) (1982), pp. 83–93.