Narrative Techniques Can Mess Up Your Business Reporting

July 26, 2021
[Image: “Computer Keyboard” via Marcie Casas, CC BY 2.0]

Great narrative is prized in journalism. Long stories with protagonists, antagonists, narrative arcs, color, drama, conflict, intrigue … sorry, got carried away.

That’s the potential problem facing business journalists and publications craving a juicy story. The demands of the form can cause people to miss what’s really going on.

Some of the most damning evidence comes from the great corporate scandals of the last two decades. The companies that became corporate catastrophes, and the charismatic—for a while, anyway—people behind them, were stories too good to pass up.

But like a conjurer’s misdirection, the narratives became distractions for reporters while a different and conflicting story lay underneath. The vaunted business and technology genius of Enron, Arthur Andersen, WorldCom, Bear Stearns, Lehman Brothers, Washington Mutual, Bernard L. Madoff Investment Securities, Theranos, Wirecard, and so many others masked huge voids.

And those are only the ones that are publicly known.

Facebook’s relationship to privacy provides a good example. There were clearly problems for many years. Then came a 2019 New York Times article about the company’s chief technology officer. The headline—“Facebook’s A.I. Whiz Now Faces the Task of Cleaning It Up. Sometimes That Brings Him to Tears.”—is a terrific encapsulation.

When asked about the Christchurch, New Zealand shootings and the time Facebook took to remove the videos, here’s how reporters Cade Metz and Mike Isaac conveyed it:
Mr. Schroepfer went quiet. His eyes began to glisten.

“We’re working on this right now,” he said after a minute, trying to remain composed. “It won’t be fixed tomorrow. But I do not want to have this conversation again six months from now. We can do a much, much better job of catching this.”

The question is whether that is really true or if Facebook is kidding itself.

Even though the story focused on a problem, it was, as Karen Hao wrote in MIT Technology Review, “a humanizing profile of a sensitive, well-intentioned executive striving to overcome the technical challenges of filtering out misinformation and hate speech from a stream of content that amounted to billions of pieces a day.”

While her piece was initially supposed—intentionally, from Facebook’s PR perspective—to focus narrowly on Joaquin Quiñonero Candela, a director of AI at Facebook, and a project about AI bias, Hao got to deeper issues:

Everything the company does and chooses not to do flows from a single motivation: Zuckerberg’s relentless desire for growth. Quiñonero’s AI expertise supercharged that growth. His team got pigeonholed into targeting AI bias, as I learned in my reporting, because preventing such bias helps the company avoid proposed regulation that might, if passed, hamper that growth. Facebook leadership has also repeatedly weakened or halted many initiatives meant to clean up misinformation on the platform because doing so would undermine that growth.

She managed to incorporate narrative and still not lose track of the real story, which was about Facebook’s claims, business goals, and actions. Had that approach been the norm early on, could more attention or action have come sooner? Maybe, even if it left some dry eyes in the audience.