Here is an article from our national crisis preparedness lead Vlad Grigore that proposes companies take a different approach to testing emergency preparedness. This piece was originally published at Oilweek.com on January 29, 2014.
Emergency preparedness is a funny business. We plan and think and practise and train, getting ready to do a job that we hope we never have to do. And when there are no emergencies—99 per cent of the time—emergency preparedness seems like nothing but a drain on a company’s budget. Its activities are often relegated to the back burner in both budget and time. Many companies realize emergency preparedness is an area that cannot be completely neglected, but, as simple economic theory teaches, the future is discounted heavily enough that, to most, it makes more sense to save money now than to worry about a catastrophe that may come tomorrow.
Practitioners of emergency response often have to state publicly that their organizations are well-prepared (since saying they are not implies the practitioners aren’t doing their jobs). But privately, most will quietly complain, correctly, that their organizations don’t do enough to be truly well-prepared to handle a crisis. I am pretty sure that unless human nature changes dramatically, emergency preparedness will continue to sit on a back burner—until there is a catastrophe. Even then, a crisis only buys emergency preparedness programs a short window of increased scrutiny (and commitment and funding). With time, memories fade and people start to discount the future again.
So if there is nothing we can do to ensure that continued appropriate attention and commitment are devoted to emergency preparedness, is there anything we can do to improve our chances within the current paradigm? I would argue that there is: we can stop taking tests.
To some, this is pedagogical heresy: “What do you mean? We shouldn’t test our emergency response programs? How will we know if we are any good? How irresponsible!”
Well, in a way, these naysayers are right. We do need to evaluate our level of preparedness and the skills and abilities of our personnel. So perhaps tests are still a good idea. But why can’t we take them with open books? And why can’t our teachers explain a question when we don’t know how to answer it while we are taking the test?
Most current emergency response exercises happen annually. They occur after a small amount of training and are evaluated by an exercise facilitator, often a third-party provider of this service. The participants come in with little preparation and try their hardest to successfully follow a plan and/or a process they have very little familiarity with. They are evaluated, and at the end of the exercise they are told they did really well for something that isn’t part of their day job. They practise doing the wrong things (and some right things), are corrected at the end, and then don’t get to practise their corrected actions until the following year. And this is all done in the name of testing—of making sure our organizations know what to do, when we know very well that they really don’t.
For a “well-oiled” (pun intended) emergency preparedness team, one that practises all the time, I think a purely evaluative exercise is an excellent idea. The team gets lots of practice and coaching, so why not test in objective conditions—and pile on the pressure—to make sure they really are as good as they think they are?
But for an organization that only practises once a year, for participants who spend the majority of their training time in a single exercise, I don’t think we can waste the learning opportunity on pure evaluation. These organizations (and they are the majority) must use this opportunity to learn as much as possible. So put on the exercise, still with little preparation, if there isn’t the commitment for more. But use it to teach and coach the whole time. Have facilitators assist participants from the get-go. Allow them to start down the wrong path or to make mistakes, but correct them quickly and make sure that, in the end, everybody practises doing the right things as opposed to the wrong things.
Under such a paradigm, participants not only learn much more but, because they are actually doing the right things—as opposed to only hearing about them afterward—they will retain their lessons longer. And when a real emergency happens, they will be much more likely to respond appropriately, saving lives, assets and reputations.