Monitor & Educate

One of the finest investigative reports I have ever read is not from a fraud investigation but an analysis of the Space Shuttle Columbia disaster.  At nearly 250 pages it is not light reading, but it provides an outstanding education on process, controls, and lessons for organizations.  The link to the report is here.

The objective is not to dissect NASA, although the report does identify plenty of things that went wrong, particularly NASA's inability to change its culture and learn from the mistakes that contributed to the 1986 Challenger disaster.  The real gem of the report lies in the comparisons between NASA and the U.S. Nuclear Navy.

Both are large government organizations, and both operate complex machines in environments where a machine failure means the immediate death of the crew.  NASA suffered two tragedies within 20 years; meanwhile, as of the report date, the U.S. Navy's nuclear-powered warships had traveled the equivalent distance of 265 lunar round-trips without an accident.

The report goes on to examine the thematic factors behind what the investigative board members believe explains the Navy's string of success.  Organization leaders, accountants, and auditors might recognize these items as essential elements of a strong control environment.  The categories listed in the report:

Communication and Action: Relevant personnel at all levels are informed of technical decisions, and PowerPoint briefings and papers do not substitute for real work.  In the fraud world, as in the world of complex machinery, these problems do not just happen to others; they can affect us as well.  So let us seek out and illuminate the symptoms and deal with them early, rather than ignore them and hope they take care of themselves.

Recurring Training and Learning from Mistakes: The Navy does not only learn from its own mistakes; it learns from others' as well.  Ten years after the Challenger accident, the Navy was still sending thousands of its people to education programs to learn from that disaster.  We do not have to suffer our own frauds to learn what can go wrong in our organizations: look at the documents in my library here, check out some of the excellent books on my suggested reading list, set up Google Alerts to ping you when a fraud is reported, and regularly read The Wall Street Journal, which reports on a major fraud at least 3-4 times every month.

Encourage Dissenting Opinions: Navy officials are not arrogant enough to believe that because no one is dissenting, they must have the perfect plan.  If there are no dissenting opinions, then perhaps they did not examine What Can Go Wrong closely enough.  In healthy organizations, rank does not trump facts.

Knowledge Retention: Directors serve long terms, and their programs document the history of technical decisions.  Current and past issues are discussed in all-hands open forums.   

Worst-case Scenarios: The Navy examines and prepares for a wide range of worst-case scenarios.  It is not about being pessimistic and cynical, and it is not about preparing to fail.  In fact, because they consider and prepare for the worst cases, their safety record represents a model for us to follow.

Fraud, for some organizations, is a worst-case scenario.  It was for Enron, WorldCom, Sunbeam, Arthur Andersen, Bernie Madoff, Judge Mark Ciavarella, and Barings Bank.  But for each of these, there are many organizations you never hear anything about.  Disciplined organizations do not often make the news, and that suits their employees just fine.
