Tuesday, December 20, 2011

The future of journalism is "fact-based narratives" (or literature with rigor)

Excerpts from Philip Meyer's text "Precision Journalism and Narrative Journalism: Toward a Unified Field Theory", dated October 3, 2011 (original here):

«[In the] second half of the 20th century, journalists began experimenting with two new ways of making the quest for truth more manageable. Precision journalism borrowed the tools of science. Narrative journalism was based on art. In their early stages, these two approaches seemed to be in conflict. My argument today is that, in the 21st century, we should consider the possibility that we need both.
[…] Science creates structure with what Lippmann called schematic models, which come from theory. Art creates structure through the narrative design in storytelling. Is it possible for us to find ways to merge these two strategies and tell stories about the data that are grounded in verifiable theory?

[…] In 1974, the year that Walter Lippmann died, Everette E. Dennis and William L. Rivers published a book titled “Other Voices.” It catalogued what their subtitle called “The New Journalism in America.” They labeled one of their categories “the new nonfiction.” Today it is known as “narrative journalism” or “creative nonfiction.” In recognition that it has become a mainstream technique, Harvard University’s Nieman Foundation ran a well-attended annual conference on narrative journalism from 2001 to 2009. Attendance peaked at nearly 900 in 2008, before the recession discouraged discretionary travel. The genre first started to appear in the 1960s. Its fiction-like techniques include internal monologue — what a newsworthy person was thinking — and detailed character development and scene building. Early practitioners included Gay Talese and Tom Wolfe and Jimmy Breslin. Some novelists tried it working from the other direction, creating the nonfiction novel. There was Norman Mailer’s “The Executioner’s Song” and Truman Capote’s “In Cold Blood.”

It was also in the 1960s that some of us began to apply social science research methods—including statistical analysis and hypothesis testing—to the practice of news reporting. This genre is often called “precision journalism,” a term coined by Dennis. He and Rivers saw the conflict. The narrative journalists, they said, “are subjective to a degree that disturbs conventional journalists and horrifies precision journalists. In essence, all the other new journalists push reporting toward art. Precision journalists push it toward science.”

[…] Both genres, narrative journalism and precision journalism, are special forms requiring special skills. If we were to blend the two, what should we call it? I like the term “evidence-based narrative.” It implies good storytelling based on verifiable evidence.
Yes, that would be an esoteric specialty. But I believe that a market for it is developing. The information marketplace is moving us inexorably toward greater and greater specialization.

Since the end of World War II, journalism has been evolving from a mass production model to one of more intimate communication. Traditional media were manufacturing products. They required economies of scale to cost-justify their means of production – a printing press or a broadcast transmitter. And so journalism was a matter of creating a few messages designed to reach many people. But as technology increases the number of channels, the new information economy supports more specialized content—many messages, each reaching a few people. That means, as a public, we have fewer common experiences that build common values.

[…] An environment that rewards specialization need not limit itself to subject matter specialization. It can also build a specialty based on methodology. Both precision journalism and narrative journalism appeal to a sophisticated audience, one that appreciates the need for information to be structured in a way that focuses attention on the truth.

That is why it is not so wild a dream that evidence-based journalism, incorporating precision with narrative, could fill a need for trustworthy interpretation and selection of the relevant truth from the eternal data stream.

[…] The need for systems that synthesize and process data into shared knowledge will, I predict, become obvious. Unprocessed data is indistinguishable from noise. As the unending stream of data increases, so will the demand for institutions and better methods to process it.
Citizens can offer Twitter updates from a scene, and reporters can look for patterns and determine which tweets might be self-serving or fraudulent.

[…] Greater analytic and narrative skills will be needed. It won’t be often that the two skills, analysis and narrative, can be combined in one reporter, so we’ll need more team reporting and editors capable of recruiting and managing the necessary talent. In other words, the old media will have to change, too.
ProPublica, based in New York City, teamed with The Seattle Times last December to report on the home mortgage crisis in the USA. ProPublica supplied a computer specialist, Jennifer LaFleur, to do the heavy lifting in the data analysis, and the Times provided a reporter, Sanjay Bhatt. They drew a probability sample of about 400 foreclosure filings in each of three widely separated metropolitan areas, Seattle, Baltimore and Phoenix. Their jointly written story combined quantitative analysis with human interest reporting, and it showed vividly how the combination of relaxed lending standards and inflated home prices caused the crisis.
Institutions like ProPublica have an enormous opportunity. They can stand out so clearly from the noisy confusion of information overload, their value could be so rare, that everyone will want to pay attention to them.»
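The probability-sampling step Meyer describes in the ProPublica example (about 400 foreclosure filings drawn from each metro area) can be sketched in a few lines. This is only a minimal illustration, not the team's actual method: the filing IDs and counts below are hypothetical, and a simple random sample is assumed.

```python
import random

def draw_sample(filings, n=400, seed=42):
    """Draw a simple random (probability) sample of filings.

    Each filing has an equal chance of selection, which is what
    lets the resulting sample support quantitative claims about
    the whole population of filings.
    """
    rng = random.Random(seed)  # fixed seed so the sample is reproducible
    if len(filings) <= n:
        return list(filings)
    return rng.sample(filings, n)  # sampling without replacement

# Hypothetical filing IDs for the three metro areas in the story
filings_by_city = {
    "Seattle": [f"SEA-{i:05d}" for i in range(12000)],
    "Baltimore": [f"BAL-{i:05d}" for i in range(9000)],
    "Phoenix": [f"PHX-{i:05d}" for i in range(15000)],
}

samples = {city: draw_sample(f) for city, f in filings_by_city.items()}
for city, sample in samples.items():
    print(city, len(sample))
```

Reporters would then read and report out only the sampled filings, rather than all of them, while still being able to generalize about rates of, say, relaxed lending standards.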