The Private, Exclusive Guide for Serious Divers Since 1975
"Best of the Web: scuba tips no other source dares to publish" -- Forbes
November 2017, Vol. 32, No. 11
If You Make a Mistake …

don’t be afraid to talk about it

from the November, 2017 issue of Undercurrent

If a near-miss occurs in aviation, everyone involved -- pilots, airline companies, government agencies, passenger witnesses, air traffic controllers and airport administrators -- works together through the federal agencies, the NTSB and the FAA, to find the cause.

One reason for involving all the parties is that, on average, it takes at least seven mistakes to cause a catastrophic error in aviation. So there's a lot to learn from taking apart the details and looking at the context-rich data across different areas. Investigators have the help of black box recorders, video surveillance systems, team-based communication protocols, and practiced simulation to maximize safety effectiveness. And they have a level of legal protection that shields candid information from the judiciary.

Notably, even when a single individual's actions are the sole cause of an accident or near-miss, a failure of that magnitude exposes a failure of the system itself. That, the industry agrees, is something worth discussing and fixing.

Diving has a Lesson to Learn

Now consider what happens in the diving industry when someone puts his/her hand up and says "We had a close call ... here is what happened."

Often the account includes discussions of missed checks, incorrect assumptions, poor briefing, inadequate assessment of the risks, changes that were obvious only after the event ... and so on.

The armchair pundits, using online forums and social media, start pointing fingers and saying, "You should have done this or that," without looking at the systemic or cultural issues at play. Hindsight bias provides a clarity that is not possible in real time.

I have recently seen two instances. The first involved a four-person dive team drifting at 230-260 feet (70-80m) that got separated from its chase boat because the skipper didn't spot their surface marker buoys. They were picked up nearly 10 km from the drop-in point.

The second was a fatality in the Far East when a group of Open Water divers entered a wreck at 130 feet (37m), each using a single cylinder. The inside of the wreck silted out, and one of the divers got lost and died.

The result of this online criticism? People stopped telling their stories -- stories that would have allowed others to learn from the experience.

You can't teach everyone everything in a training class. You don't have enough time, nor does the instructor have enough experience. Therefore, divers have to learn from others' mistakes, and the decisions made in near-misses are instructive, even if the divers broke the 'rules.'

The acceptable level of risk an individual takes is a personal construct ... with hindsight, you are always better informed than the person at the time.

The diving community does not have an FAA or a CAA or an overarching regulatory body (which I think would add a level of complexity to a recreational activity that's probably not needed).

Furthermore, diving does not have an independent investigative body such as the NTSB. So, any investigation conducted is often protectionist in nature due to the threat of legal action. Indeed, one agency's incident form says, "This form is being prepared in the event of litigation." What are the chances that the 'real' story will come out if it means that rules, guidance or processes have been broken?

Human error is normal. If consistent errors lead to a near-miss, these are systemic issues, not individual issues. However, to identify where those systemic issues lie, we need to collect data that can be collated and analyzed using a standard framework -- one that covers not only the proximal cause but also allows us to identify systemic issues. Aviation is as safe as it is because the industry has learned to recognize that human error and failure are normal, and participants and investigators talk about them in a non-judgmental way.

Crucially, you can fire an individual, but if you don't change the system, the failure will continue to happen.

-- Gareth Lock

So we at Undercurrent suggest that if you make a mistake, you talk about it. Post a description of what happened online. Share the experience so that others, too, may learn from it -- and weather the storm from the armchair pundits. They are of no consequence.

Gareth Lock is a retired military aviator with a passion for improving human performance and diving safety, drawing on his direct experience and knowledge of human factors and non-technical skills. He runs a consultancy that has developed globally unique online and face-to-face training classes to improve diver safety and performance in the sport diving community. He regularly writes and presents on this subject. www.humanfactors.academy



Copyright © 1996-2021 Undercurrent (www.undercurrent.org)
3020 Bridgeway, Ste 102, Sausalito, Ca 94965
All rights reserved.
