Over the last couple of weeks we have been working with a client on reviewing and testing their emergency plans. The client has been really engaged and facilitated a range of realistic drills to test and measure the actual effectiveness of the plans. It got me thinking about business in general though. I’m sure that this client is the exception rather than the rule when it comes to testing. How many businesses test their plans and how many of those tests are a realistic appraisal of what the plan will look like in reality?
Testing the wrong thing
I’ve worked in and reviewed many businesses that waste their time testing emergency plans in ways that will never be replicated in a real emergency. For example:
- Hotels generally conduct fire alarm tests and fire drills at 12 noon on a Monday or Tuesday — the highest staffing level and the lowest occupancy. Can this really be regarded as a reflection of how an evacuation might look at 0400 on a Saturday night, with only the Night Manager and Night Porter on duty and a building full of sleeping guests?
- Nightclubs that run fire evacuation drills at 2230, just as they open, with only 5 or 6 people inside.
- Paper-based emergency plans that are written down but never acted out. For example, I once did bomb threat training with a venue whose plan was to lock the doors to keep people out in the event of an explosion outside. The only problem was that the entire front of the building was made of glass.
- A corporate security building we reviewed a number of years back that brought in all members of the security team twice a year to conduct emergency drills. Those days were the only days in the year when all members were present.
All of these are examples where tests are completed, but what is being tested bears little resemblance to reality.
The value of emergency plans
Written emergency plans are great, but I always think of them like a new car from a showroom. As soon as you drive it off the forecourt, it loses value. The same goes for emergency plans: unless they are kept up to date and relevant, they lose value very quickly. If you regularly clean, service, maintain and upgrade the car, it holds its value. If you do not, it ends up being scrapped for a whole new vehicle far sooner than it should. The same principles apply to emergency plans.
Plans need to be tested, and tested in context — reviewed, tweaked and upgraded over time to stay relevant. Otherwise they are simply security theatre. Unfortunately this requires investment in time and resources, and sometimes inconvenience to our employees and customers. That’s the trade-off between actually keeping them safe and merely talking about keeping them safe.
Some things that I have found useful for testing:
- Chunk it down. Run lots of small or partial tests over the year rather than relying on a single large one. Small drills build habits and allow you to drill into specific challenges within specific areas. Culminate in one large, meaningful set of tests per year.
- Use penetration tests. Get others to test your plans regularly, particularly access control plans and responses. Having an outside eye look at and actively test your plan can reveal areas you have become blind to.
- Equipment practice: Outside of test times, have your teams practise with equipment. Whether it’s emergency radio calls, fire equipment or generator tests, it builds proficiency, confidence and habit.
- Research: Look at emergencies around the world and run through ‘what if’ scenarios with your team. Learning from the emergencies of others is far more cost-effective than waiting for one to happen to you.
- Collaboration: Try to organise a large collaborative emergency exercise once a year with other local businesses and government agencies (police, fire etc.). Publicise it and show your customers how proactive you are about their safety. I’ve facilitated a number of these over the years and the public feedback has always been great.
Emergency plans can be a critical part of a security or risk management system, or they can be a complete waste of everybody’s time and effort. Sometimes it is better to do nothing than to test the wrong things in the wrong way and build false hope and bad habits. It doesn’t have to be difficult or too inconvenient. You can’t measure success in emergency preparedness by the fact that nothing bad has happened yet; hope isn’t a reliable strategy. Thoughtful risk assessment, planning and testing can provide peace of mind and assurance that if or when something happens, the response will work. We know, because we have tested it.