Lessons we can learn from PwC’s Oscars envelope fiasco
Okay, everyone has had a week to laugh: at Big 4 partners who can’t maintain the attention to detail that is normative for all CPAs below the partner level, at the incompetence of bean counters in general, or at Hollywood embarrassing itself with two different errors in one show. Your choice of amusement depends on your world view.
Okay, give yourself one more chuckle.
Are we all done now?
It might be time to see what we can learn from this fiasco other than pay attention to details. In my brief blogging career, I’ve learned that studying major news stories while they are still fresh is a superb way to learn. We are still curious, so we will pay attention for just one more moment.
Let’s look at this fiasco from a disaster theory perspective.
Slate ran a superb article on March 3 that can help everyone learn: How Disaster Science Explains the Oscars Mix-Up. The subtitle, which I’ll quote, gives a great summary: Major errors don’t cause disasters. Banal mistakes and human nature do.
I will describe a number of the ideas in the article and provide my observations.
The author’s contention, along with the point of a specifically cited book, is that massive disasters normally are the result of a series of smaller, quite human mistakes. String together several of those non-serious errors in an unfortunate series and a massive disaster can result.
A key quote in the article says that the typical cause of disasters is banalities and trivialities.
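That contention has a simple arithmetic behind it, which I’ll sketch here as my own illustration (the probabilities below are made up for the example, not from the article): if a disaster requires several independent small failures to line up, and each failure on its own is fairly unlikely, the chance of the full chain is the product of the individual chances.

```python
# My own illustration of the "chain of banal mistakes" idea, with
# assumed (made-up) probabilities that each small safeguard fails.
p_failures = [0.05, 0.10, 0.20, 0.25]

# A disaster requires every safeguard to fail at once, so (assuming
# the failures are independent) multiply the probabilities together.
p_disaster = 1.0
for p in p_failures:
    p_disaster *= p

print(f"Chance all four safeguards fail together: {p_disaster:.5f}")
```

Each mistake is banal on its own; it is only the unlucky chain of all of them together that produces the disaster. That is also why removing any single link, as we will see below, would have prevented or quickly corrected the mix-up.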
Article suggests a series of minor issues combined into the fiasco we saw last week. Consider a few of the contributing factors in the article.
The Academy has an outsider tally the results and keep them secret until seconds before each announcement to minimize the risk of a leak. In 1940 the results were released hours before the event. Article says you could have picked up a newspaper on the way to the show that listed the winners.
With the speed of social media today, a leak while the nominees are being announced could be around the world and known by half the television audience while the clips are being played. When every person working on the production and in the live audience has a smart phone with internet access, the risk of leaks today is astronomical.
None of the staff working the event know who the winners are, so they can’t identify an error. That means eliminating one risk (early release) with another procedure (only two people know the results) creates new risks from the new procedure.
Studies suggest that highly experienced people doing critical work that is routine (for example, partners of the CPA firm handing out the award cards) might become complacent in those routine tasks.
My observation is that CPA partners with 20 or 30 years’ experience are exactly the right people to make a judgment call on pulling the trigger to add a going concern qualification to the auditor’s report or deciding whether to drop a litigation disclosure, but those folks just might not be best for tasks that require extreme attention to detail.
Can you imagine having a 30-year partner perform a test of transactions on a student financial aid audit with 60 sample items, each with 30 attributes to test? Yeah, I can’t quite picture that as working well either. Far better to have the audit senior run that tedious test or more likely the experienced staff person.
Another factor not discussed much is that 83 years is a really, really long time to have one CPA firm in place. I oppose the concept of mandatory auditor rotation. An arbitrary cutoff of some randomly determined number of years to change firms won’t provide any value.
On the other hand, as years (or decades!) pass, the familiarity risk rises. That risk should be considered, both by the client and the firm. If you thought an automatic “do it SALY” (same as last year) was bad and “we’ve always done it this way” is worse, just imagine trying to stand up to “we’ve done it this way since your dad was a staff accountant on this job”.
Another procedure was put in place because it is possible that a presenter might enter the stage from the opposite side from the one rehearsed. Thus, there are two sets of envelopes, one at each side of the stage. In turn that creates the risk of an error in handing out the cards, because there will be a left-over spare for every completed award.
Article says many accidents occur at the end of a project. Cited example in the article is that most mountain climbing accidents happen on the descent. I can understand that: after the excitement of reaching the top (Yeah! We did it!), climbers could easily relax on the way down, thus increasing the risk of injury. The disaster at the Oscars was on the last award of the night. I can just imagine the reaction of the partners after they handed out the last card, since their job was complete (Whew! Nailed it again! Hey, I oughta’ get some A-lister pics real fast since the cards are all gone!).
The envelopes were redesigned this year to be more appealing on camera: subtle gold letters on red look pretty but are harder to read than black letters on a white label. Thus the envelopes were difficult to read. Check out the published photos of the card carried to the podium by Mr. Beatty – even blown up larger than full size, it is difficult to read the name of the award.
Brian Cullinan could have prevented the disaster if he had looked yet again at the envelope to quadruple check, or quintuple check, he was handing out the correct one.
The same concept applies to Warren Beatty, according to the article. Multiple articles I’ve read, including this one, point out that the physical actions of Mr. Beatty indicate he knew something was wrong.
What was the visible problem? The card gave one woman’s name and a movie name. The card ought to have shown one movie name and multiple individuals. The results should have been different from the previous award; after all, how could the results on two consecutive awards be exactly identical? The barely readable outside of the envelope didn’t say Best Picture. Mr. Beatty apparently knew something was wrong. Article points out, for those of us (like me) who didn’t already know, that he is experienced in these award shows.
The article brings up the idea of ‘sense-making’ at this point.
When we encounter a situation that departs radically from what we were expecting, it takes a few moments to figure out what happened and to make some sort of sense of the situation. We need a bit of time to fit this weirdness into our frame of reference.
I experienced a teeny tiny example of this on an audit some time back (no identifying details or even timeframe will be provided). I observed something that, while not wrong, was a serious inefficiency in the client’s procedures. I ought to have recognized there was something way outside the range of reasonable, but because I was focused on another part of the audit and I’d never seen this peculiarity before, I didn’t pick up on it.
A while later, the person I’d talked to earlier described how inefficient the thing was that I observed. I immediately realized what I saw earlier and then knew there was an issue. But I missed it when I first saw it. It didn’t make sense at the time.
That sense-making concept explains several things: the pause by Mr. Beatty and then his looking to Faye Dunaway as if looking for help or verification of an error or some answer to the oddity or validation that it was correct. It also explains why Mr. Cullinan and Ms. Ruiz froze. It took a few moments to process that something just went horribly wrong. In terms of the concept cited in the article, they all had to take some time to make sense of what just happened. That would be the ‘freeze’ we’ve read about.
Ms. Dunaway also missed the error. Eeeeeveryone knew, just KNEW, that La La Land was going to win best picture. When Mr. Beatty turned the card to her, she didn’t notice anything amiss, because she wasn’t even expecting to read the card. She apparently only saw the name of a movie. So she apparently figured that Mr. Beatty was giving her the honor of making the announcement. That is a perfectly reasonable assumption to make, and since seconds are precious, one ought not decline by saying “no, you read it”, with the response “oh, it’s okay, you read it”, and then a gracious “gee, thanks, you are so kind.” The thought that there might be an error would not have even entered her mind – I get it – there is absolutely no reason to even consider that possibility when Mr. Beatty unexpectedly handed her the card. Missing critical information (like an individual name instead of a long list of individuals) in a high-pressure moment is human nature.
Article discusses that freezing in critical moments is also a human reaction. Article asserts that both the Three Mile Island and Deepwater Horizon disasters could have been prevented by instantaneous reactions from operators.
I have previously read that the TMI disaster was the result of a long string of minor issues. For merely one example, one piece of equipment wasn’t working, so a ‘danger tag’ was properly hung on the switch or circuit or whatever.
Good. That is what should be done.
Unfortunately, the location of the switch, the length of the string holding the tag, and the size of the tag were exactly the wrong combination, so that the paper tag covered a critical warning light. Thus, a flashing light which would have provided one more warning to the operators in the middle of the crisis was covered by the properly hung danger tag.
Maybe, just maybe, if that flashing light had not been covered, none of us would know what TMI means because there would be nothing to talk about.
Ironically there is actually a resilient feature in play in this story – something which compensated for other issues.
Article says Jordan Horowitz, La La Land producer, stepped up and announced the actual winner. He had seen the correct card (according to the article) and knew what must be done. While everyone else on stage was sorting out what to do, he had a few moments for ‘sense-making’, realized the problem, and took action. I imagine it was horribly painful to make the announcement, but he did what was urgently needed at that second.
Another irony, according to a different article I read, is that the proper protocol for an announcement error is to tell the presenter and then have the presenter make a correction.
So on one hand, Mr. Horowitz’s announcement was a violation of protocol. He did not follow the plan.
On the other hand, he recognized the error, knew what should be done, knew that no one else was ready to make a correction, and likely realized that sorting out the error, then describing it to Mr. Beatty (or should you tell Ms. Dunaway since she made the announcement? All in favor of Ms. Dunaway raise your hand. Oh, okay, Mr. Beatty it is. Where is he standing, oh yeah, over there), and then having Mr. Beatty step to the mic around all the other people on stage, and then him making an announcement would burn up far more time than it took you to read this intentionally jumbled paragraph. The show could not afford another 30 seconds of tumult with 30 million people on the edge of their seats waiting for the results.
Mr. Horowitz did what was needed. He broke protocol. Winner corrected. Show on track. Get off stage. Now!
At least one step in the process was resilient.
So, there are many places that some little thing went wrong, according to the article. Any one of those pieces not going wrong would have prevented the disaster or corrected it far faster. (Auditors call that prevent or detect & correct.)
It looks to me like Mr. Cullinan deserves a large share of the blame. It was his job to hand out the correct envelope. However, we need to keep in mind there are many other places in the process where a little thing went wrong.
Now we have some things we can learn.
I’ll start the list:
- Assign tasks to people with the right skills.
- Pay attention to details.
- Double check yourself multiple times on critical details (check a fourth time that the price is correct in the proposal; check yet again the word “not” is removed from or added to that critical sentence).
- Try not to create new problems when you add a new procedure to fix a previous problem.
- If something seems really wrong, stop to figure out what is going on.
- We can’t change our human nature, so realize we have an inborn tendency to freeze (article says animals that are frightened by the appearance of a predator and freeze as a result might just live longer than animals that bolt).
- Build in redundancies.
- Build in some resilience to procedures.
What do you think? Did I misinterpret something? Get something wrong?
What lessons can you learn from this fiasco when you look at it from a disaster theory perspective?
Professional comments welcome. I’ll be the one unilaterally determining what is professional.