- Is the river behind your house rising? A British Twitter bot will tell you
- When a digital subscription costs more than a print one
- From Grumpy Cat to Ukraine: How Mashable is expanding beyond gadgets and apps
- Data visualization is good. Data visualization that works on your phone is better.
- CIR wants to turn investigative reporting into a weekly public radio show with Reveal
- This Week in Review: The Fox/Time Warner dance begins, and clickbait and its discontents
- Public records tools, reader metrics, and more among new Prototype Fund winners
- Germany is getting a data-centric nonprofit newsroom and hoping to build new models for news
- The newsonomics of the new quest for big, big, big
- A change in the IRS process for granting tax-exempt status could be a boon to nonprofit news
- I’m feeling lucky: Can algorithms better engineer serendipity in research — or in journalism?
- WFMU wants to build open tools to help radio stations (and others) raise money and build community
- ProPublica sees $30,000 in new revenue from Data Store
- It’s great to resurface old stories, but it’s also great to let readers know what you’re doing
- Was Sports Illustrated used by LeBron James, or did SI figure out the right form for the story?
- Report around the clock: How some news orgs use time zones to their advantage to operate 24/7
- Twitter shut down @ReplayLastGoal
- This Week in Review: Facebook and online control, and educating stronger data journalists
- The Dallas Morning News abandons its “premium experience” strategy
- Pew study: New media outlets attempt to fill void in statehouse coverage across the U.S.
- Ken Doctor: Mind your own business, Facebook and Google
- At NPR, retweets are endorsements after all
- Alberto Cairo: Data journalism needs to up its own standards
- Q&A: Tarleton Gillespie says algorithms may be new, but editorial calculations aren’t
- Seeking Alpha doesn’t have to reveal the identity of a user who slammed a stock
- Amnesty International launches a new site to help journalists verify YouTube videos
- How The Wall Street Journal is celebrating its 125th anniversary while also looking ahead
- This Week in Review: Questions on Facebook’s experiment, and a knockout blow to Aereo
- The newsonomics of The Oregonian’s new editor’s challenge
- What will be the most dangerous threats to the Internet over the next decade?
Here's an interesting project from the data-oriented software developer Shoothill: GaugeMap is an interactive map with live river-level data from over 2,400 government gauges across England and Wales. From the announcement:
GaugeMap aims to help to look after and improve the natural environment by allowing these users to access this data on the move, wherever they are. Users can retrieve live data on actual river levels via the website, or by following the new, dedicated Twitter accounts that GaugeMap has established for each of the Environment Agency's 2,400+ river level monitoring stations they may be interested in. For example Teddington Lock now has its own Twitter account: https://twitter.com/riverlevel_1182. "GaugeMap will help any river user to be better informed, whether they use the river for recreation, pleasure or business," said Rod Plummer, MD at Shoothill. "It also provides accurate, up-to-date information to help with water abstraction and so it could potentially be used to ensure the amount of water being abstracted from any river at any given time is sustainable and acceptable. Over-abstraction of river systems can cause changes in water quality, which obviously can have wide-reaching impacts on the wildlife that relies on our natural waterways, both directly and indirectly."

It's the Twitter integration that most interests me — over 2,400 accounts, each tied to a specific spot on a specific river, sending out alerts about water levels:
At 05:45 (21/07/2014) the river level at this station fell to 5.107m. More info: http://t.co/nX9UYpsO91 -- Bolton Percy (@riverlevel_1867) July 22, 2014
At 15:15 (21/07/2014) the river level at this station rose to 0.537m. More info: http://t.co/3mWhoFTHTR -- Winster Drive (@riverlevel_1345) July 21, 2014

One could imagine ways to improve the bots. For instance, the accounts don't seem to be smart enough to automatically alert when the water gets dangerously high. The GaugeMap site tells me that Catcliffe Drain is in a "Flooding Possible" state, but you couldn't tell that from Catcliffe Drain's Twitter account. Still, the idea is powerful: a kind of distributed EveryBlock. One could imagine a local news organization gathering together data like this and pushing it out through neighborhood-focused social media accounts, automatically and without human intervention.
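The missing flood-alert feature would be simple to add. Here's a minimal sketch in Python, with the caveat that the station thresholds are invented for illustration and GaugeMap's actual data feed and internals aren't documented in the announcement; it only composes the alert text, leaving the posting step out:

```python
from typing import Optional

# Illustrative only: these threshold values are assumptions, not
# Environment Agency data.
FLOOD_THRESHOLDS_M = {
    "riverlevel_1867": 5.5,   # Bolton Percy (assumed threshold)
    "riverlevel_1345": 1.2,   # Winster Drive (assumed threshold)
}

def compose_alert(station: str, level_m: float) -> Optional[str]:
    """Return warning text if this station's level has crossed its flood threshold."""
    threshold = FLOOD_THRESHOLDS_M.get(station)
    if threshold is None or level_m < threshold:
        return None
    return (f"Flood warning: the river level at this station has reached "
            f"{level_m:.3f}m, above the {threshold:.1f}m threshold.")
```

A scheduled job could run this against each gauge's latest reading and tweet only when it returns a message, which is all the "Flooding Possible" state on the website would need to reach the Twitter accounts too.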
Former newspaper editor John Robinson notes some unusual pricing at his old daily, the Greensboro News & Record:
The surprising thing to me – and which I believe is unusual for newspaper paywalls – is that the N&R is charging more for a digital subscription than for a print subscription. Currently, a 7-day, 52-week subscription costs $187.12. According to an ad in the newspaper today, the digital subscription is $215.40. (FYI, the subscription page on the website hasn't been updated, at least that I can find.) In comparison, the News & Observer charges $390 for a year's print subscription, and only $69.95 for a digital subscription. The Star News in Wilmington charges $218.40 for the print edition, and $131.40 for a digital subscription. But the N&R is cutting the other way. Editor/Publisher Jeff Gauger explains: "The reason for that variance? A print subscription permits us to subsidize the cost of content by providing access to your home or business for preprinted advertising circulars. A digital-only subscription lacks that advertising subsidy."

The N&R's move is unusual, but it's far from unprecedented. The Orange County Register is currently offering digital access for $3.99 a week, or digital plus Sunday print for $2.99 a week. That's right: They'll essentially pay you a dollar a week to take the Sunday paper. The New York Times has done something similar since launching its paywall — Sunday print gives you all-digital access at a price that's usually cheaper than all-digital access itself. (Currently: $8.60 a week for Sunday print plus digital, $8.75 a week for just digital.) What is a bit unusual here is pricing digital above a _seven-day_ print subscription, not just a Sunday print subscription. But Berkshire Hathaway-owned papers have gone against the grain before. The Omaha World-Herald charges even 7-day print subscribers for digital access ($7 extra a month!) and $25 a month for digital alone. The Tulsa World offers $14.99 a month for digital — or $14 a month for digital plus Sunday and Wednesday print. It's a weird world. Robinson:
Is the pricing structured to encourage digital users to subscribe to the paper? After all, the more subscribers a paper has, the more it can charge advertisers. (Despite what many readers think, advertising pays the bulk of the cost of a newspaper, not subscription fees.) I doubt this is the actual intent, but it does make some perverse sense to the consumer. Unless you can get what you need from the website from its 20 free articles per month.

I can't speak for the News & Record, but at many papers, that is _exactly_ the intent. Propping up print numbers isn't the only reason to structure offers this way, but it's a big one.
Within minutes of the first reports that a Malaysia Airlines plane had crashed over eastern Ukraine Thursday, Mashable had live coverage up and running. Its real-time news staff in New York was updating the post with videos from the scene and carefully sourced information culled from social media and other outlets; its own social accounts, including its meant-for-breaking-news @MashableLive, were busy pushing out information. Meanwhile in Ukraine, Christopher J. Miller, a Mashable contributor, was working his sources and providing information to the main Mashable story while also writing his own piece as further developments unfolded. Miller and two editors in New York also cowrote a story highlighting leaked audio from an alleged conversation between pro-Russian rebels and Russian security forces discussing the plane. The breaking news story has been shared more than 30,000 times; Mashable's continued updating its coverage, including dispatches from Miller filed from the crash site.
.@Mashable newsroom trying to get to the heart of the #MH17 story http://t.co/A0nKUEmeSJ pic.twitter.com/QvkJIyfimK -- Lance Ulanoff (@LanceUlanoff) July 17, 2014

For those who remember the site's early days — when it was a tech blog covering Web 2.0 startups — the idea of a Mashable correspondent reporting from eastern Ukraine is probably still a bit disorienting. But covering big breaking news this way has quickly become the norm for Mashable — the ongoing conflict between Israel and Hamas, the recent World Cup final, Emmy nominations, and, of course, virtually every major technology announcement. Executive editor Jim Roberts, who spent 26 years at The New York Times and a short stint at Reuters, was hired last October to lead Mashable's editorial operation and expand its focus beyond social media and technology. Mashable now has an editorial staff of 77, having added nearly 30 of them since Roberts joined the company. While Mashable has maintained its traditional coverage areas — Thursday, for instance, it was using its Snapchat account to celebrate World Emoji Day — Roberts has also instilled an old-school focus on covering the news, while still experimenting with how stories are presented, said Brian Ries, Mashable's real-time news editor. "You can have the 1,200-word article if you want that — that's totally fine — but if you think the story can be better told in a series of Vines that are captioned with an explanation, then go with that," said Ries, who joined Mashable in February from The Daily Beast. "Or if it's just a video you want to highlight, then do that. It's been really freeing for me to come in here and be able to play with all different types of content because it feels like you can experiment."

A START IN UKRAINE

The Ukraine story is one that Mashable has been paying attention to since the Euromaidan protest movement broke out late last fall.
Roberts felt that the familiar east-west tensions playing out in Ukraine would resonate with Mashable's readers, and as a result it was one of Mashable's first major attempts under Roberts' leadership to take on a significant global story. "Mashable's core has really focused on the confluence between technology and digital culture, and those topics are still really essential to us and are really at the heart of what we do," Roberts told me. "I guess what I'm trying to do is, to the extent that this is possible, is cling to that core, reinforce it as much as possible, and then build around it in a very natural way." When the protests began gaining steam, Mashable started writing stories — including a couple by Roberts himself — but they were generally based on social media or other news reports. Then, in December, Roberts reached out on Twitter to Christopher Miller, a journalist at the Kyiv Post, an English-language newspaper in Ukraine. Roberts had been following Miller and the paper's coverage of the growing protest movement on Twitter, and he wanted to know if Miller would write a story for Mashable on how he was covering the protests and how social media was being used as a tool in the Euromaidan movement.
@nycjim Hi Jim. To answer your question in short, yes. Would you follow me so I might be able to DM? Thanks. -- Christopher Miller (@ChristopherJM) December 10, 2013
@nycjim Just a quick note to tell you that the story should be in your inbox. Thanks. -- Christopher Miller (@ChristopherJM) December 11, 2013

"The first story was very much gauged to their audience, which is a very digital-savvy group of readers — it was more focused to that," Miller told me via phone from Ukraine. "But eventually it just transformed into writing news. Everything from the daily story, if things were big enough and it called for that, to more exclusive pieces." Since that first December piece, Miller has written more than 70 stories for Mashable. Other journalists working in Ukraine have also contributed; in the months since then, Mashable has also published stories with datelines from Syria, Egypt, Brazil, and elsewhere. Roberts and others at Mashable readily admit that, with its relatively small staff, they cannot compete with major global news outlets on every story. "My job is very much about choosing what works for us, because we can't cover everything and we don't want to cover everything," said Louise Roug, Mashable's global news editor. She covered the Iraq war for the Los Angeles Times before becoming foreign editor at The Daily Beast and Newsweek. "That's a great freedom for us." Instead, they say Mashable tries to focus on a wide array of topics — everything from climate change to entertainment — that its audience, which Roberts notes is "younger than most mainstream news organizations," cares about and is discussing online. "Now, of course we've got this big ambition and we'd like to be covering everything that people are talking about online," said Mashable managing editor Jonathan Ellis, a former New York Times senior editor for digital platforms who joined Mashable in March. "We'll get there. But it's also fun for us to find those areas of focus that we think will make an impact with our audience."
It's analogous to the way BuzzFeed started adding more traditional news — and later correspondents stationed overseas — into its mix of cats and listicles. Mashable has no plans to open foreign bureaus, said Roberts, instead relying on well-placed freelancers like Miller and other arrangements like Ukraine Desk, a partnership it launched earlier this year with Vice, Digg, Mother Jones, Quartz, and Breaking News to cover the Ukraine crisis. In March, for instance, the Turkish government blocked Twitter in the country. It was a perfect Mashable story, at the intersection of social media and politics, and the site's initial story on the block — which quoted social media posts, cited other news reports, and included some original reporting — has been shared nearly 31,000 times. "We can shine a giant floodlight on some things that, despite all their obvious strengths, the BBC or CNN may not get to as quickly," Roberts said. "Not to say anything bad about CNN or the BBC, but when Turkey shuts down Twitter, it's the lead story of Mashable. I don't know of any other news outlet that was leading with Turkey."

"A MORE NIMBLE ENVIRONMENT"

As yet another winter storm was barreling down the East Coast last January, Roberts was reading coverage when he stumbled upon a phrase he had never heard before: the polar vortex. Wanting to know more, Roberts came upon the work of Andrew Freedman, who was then writing for Climate Central, a nonprofit that reports on and researches climate change. Freedman was one of the journalists who pushed the term polar vortex into the public lexicon last winter through his writing and media appearances. Roberts sent Freedman an email. Within 10 days or so, Roberts had offered Freedman a job. All of a sudden Mashable had a senior climate writer. "Maybe it was a bit impulsive, but it was one of the best hiring decisions I've ever made," Roberts said.
"But that’s the kind of thing that you can do when you’re in a little bit more of a nimble environment." Mashable's staff has grown significantly since Roberts took over; it raised $14 million earlier this year partly to finance the editorial expansion, which includes adding staffers in Australia, London, and Los Angeles.
After his time in traditional media, Roberts said it was refreshing to lead a digital-only organization where decisions like that could be made quickly and where the newsroom's resources to experiment don't have to compete with those earmarked for print products, still crucial to the bottom lines at major newspapers. "If you're on one side or the other, it can cause frustrations, so when you're in a digital-only organization, such as the one I'm working for, you don't have that conflict — you don't have that tug of war," Roberts said. "This is, in a sense, a luxury, but it's certainly one I enjoy having right now."

CAN YOU NERD OUT?

With more than 4 million Twitter followers and 2.6 million Facebook likes, Mashable says it draws about 34 million unique visitors a month. "We just know we've got a great team that does all the viral content," Ries said. "We have a great team that will pull together a funny list of GIFs from the World Cup. And that frees up the other side of the newsroom to go after stories that might not share as well, but are important to be told." When Freedman joined Mashable in early February, for instance, he was concerned he wouldn't be able to write in depth. The Mashable he was familiar with was more about aggregation and shorter stories. So when he received an instant message from Roberts after writing an explanatory story on a major ice storm in Georgia and South Carolina, he was a little concerned. "Of course, it's like my first week of work, and I'm terrified that he just showed up on chat," Freedman said. "His message was, 'Can you please nerd out a little bit more?' So any concern that I had that there wasn't an appetite for geeky weather, climate stuff was put to rest on like day one." But striking a balance between, say, writing about a selfie toaster and a weekend-long liveblog on the Ukrainian elections can be a tough task as readers adjust their expectations of what they'll see on Mashable.
In its marketing material, Mashable defines itself as a site for the "connected generation," and it is betting that the readers it targets have an interest in more than just viral content and technology coverage. "Certainly, there will be some folks who see some things on Mashable they won’t have expected to see, and that’s a great thing," Ellis, Mashable's managing editor, said. "The whole point of running a great publication is getting them to see things they haven't seen before."
Photo of Grumpy Cat by Anna Hanks used under a Creative Commons license.
Two of the biggest trends in news today: the rise of mobile and the rise of data visualization. The unfortunate reality is that they're often in conflict. Too many beautiful data visualizations are designed with a big desktop browser window in mind, not the smaller screen of an iPhone or an Android phone. Text becomes unreadable, interactions become untappable, and a lovely experience becomes unusable. If you want to do better, check out MobileVis, a site built by Bocoup data viz whiz Irene Ros to assemble good examples of data visualizations that work well on mobile devices. (It's funded by a Knight Prototype Fund grant.) There are lots of screenshots, illustrations of pages in motion, and notes about what makes them compelling. Ros also pulls out a set of best practices for doing visualizations for mobile:
- VERTICAL BAR CHARTS: When using bar charts in portrait mode, stack your bar chart bars vertically.
- USE VERTICAL SCROLLING: When creating interfaces that don't fit in their entirety on the screen, enable vertical scrolling instead of horizontal scrolling.
- STACK TABLE CELLS: When needing to display tables that have more than a couple of columns, consider stacking cells vertically within each row.
- CAROUSEL INSTEAD OF TABS: When allowing users to switch between different displays, instead of using tabs (which require a lot of horizontal space), consider using a carousel with next and previous buttons.
- FIX TOOLTIPS TO AREA OF SCREEN: When displaying information on touch, designate an area on screen that will update accordingly.
- USE TOUCH ZONES: When displaying a lot of data points that are hoverable/touchable, consider using defined touch zones instead.
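For a concrete (and purely illustrative) sense of the first two rules, here's a toy layout helper in Python. Every name and number below is invented for the sketch; MobileVis is a pattern gallery with screenshots and notes, not a library:

```python
# Toy illustration of two MobileVis guidelines: in portrait mode, bars are
# stacked vertically (one per row), and any overflow scrolls vertically
# rather than horizontally. All names and thresholds here are invented.

def chart_layout(viewport_w: int, viewport_h: int, n_bars: int,
                 bar_thickness: int = 24) -> dict:
    """Pick a bar-chart layout for the given viewport size."""
    portrait = viewport_h >= viewport_w
    # Guideline 1: in portrait mode, stack the bars vertically.
    stacking = "vertical" if portrait else "horizontal"
    # Guideline 2: if the stacked chart overflows the screen, enable
    # vertical scrolling instead of horizontal scrolling.
    needed_px = n_bars * bar_thickness
    scroll = "vertical" if stacking == "vertical" and needed_px > viewport_h else None
    return {"stacking": stacking, "scroll": scroll}
```

On a 375×667 phone in portrait, ten bars fit without scrolling, while forty bars trigger vertical scrolling; rotate to landscape and the bars stack horizontally instead. The real decisions in Ros's examples are made in CSS and chart configuration, but the branching logic is the same.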
It's a rare feat for the first episode of a brand new show to win a Peabody. And yet that's what happened with Reveal, the still-new public radio show from the Center for Investigative Reporting. Over the past year, CIR has co-produced three pilot episodes of the series with PRX, focusing on new investigations into subjects like poor drinking water standards and the pathway for heroin from Juarez to Chicago. Along with the Peabody, those proofs of concept were enough to win support for the show from NPR stations and help secure $3.5 million in grants to help produce the show for several years. The money, a combination of $3 million from the Reva and David Logan Foundation and $500,000 from the Ford Foundation, will go towards staffing up and taking Reveal to a full series launching in January 2015. The goal, according to CIR executive director Robert Rosenthal, is for Reveal to become a weekly investigative show, available on public radio or as a podcast. "Each of the pilots showed there is an appetite in public media for this," Rosenthal said. But turning Reveal into the public radio version of Frontline could be a challenge. Any reporter or editor will tell you that investigative stories take time. Producing such a show weekly is ambitious, and Rosenthal said they recognize they won't be able to meet a weekly pace overnight. "We've got to ramp up our own metabolism," he said. "If it's weekly or every other week, it's going to be a change for us. An important change, and a good one that helps us get control of our own distribution." Ever since its founding in 1977, the Center for Investigative Reporting has looked for partners to help spread its journalism to the widest possible audience, often appearing in print, on TV or radio, and across the web. Reveal was no different; in the second episode, CIR collaborated with WBEZ and the Chicago Reader on the heroin investigation.
For a story on arsenic-laden drinking water in the third episode, they partnered with the Center for Public Integrity.
Working in tandem with other newsrooms will be an important step in producing the show on a regular basis. Joaquin Alvarado, CEO of CIR, said the pilots were designed to see how collaborative reporting would work in producing a radio show. The show is produced by Susanne Reber and Ben Adair, and hosted by Al Letson, who also hosts the show State of the Re:Union. Where the first pilot only featured reporting from CIR, the subsequent shows were produced with stories from other outlets. A combination of both will be key to producing the show consistently, Alvarado said. "We have this goal as an organization to articulate a voice for investigative reporting and bring to audiences a lot of the great investigative reporting being done around the country," Alvarado said. Jim Morris, managing editor for environment coverage at the Center for Public Integrity, said their distribution strategy mirrors CIR's in that both look regularly for collaborators. "It just gets our work out there to a much broader audience, and a different audience," he said. In this case, reporter David Heath had already done preliminary work on his story looking at the EPA's response to arsenic levels in drinking water when the two organizations decided to team up. While CPI has worked on stories with NPR and individual stations before, Morris said this process was different because Heath was the lead correspondent for radio. That meant learning a new set of skills in addition to the usual reporting process, which Morris said took around three months. "It's a different thing to produce a 15-minute radio piece than a 4,000-word print piece," Morris said. A full-fledged radio show will provide CIR and its partners a more persistent channel for their work than one-off radio pieces. But getting the show on the air will require not just interest from stations, but flexibility in their programming schedule. The first two Reveal pilots aired on 150 stations nationwide, according to Alvarado. 
John Barth, managing director of PRX, said the pilots aired on stations in the top 30 to 50 radio markets, which he takes as a good sign for the future. PRX co-produces Reveal, bringing its audio expertise, and is also the show's distributor, helping to get it on the air. One reason Barth thinks the show will get picked up by more stations is that programmers want a full series they can give a regular home on their schedule, not a handful of one-offs with no promise for the future. PRX and CIR are interested in programs that can match radio storytelling with longform investigations, Barth said, and he thinks stations are interested too. "I think we both recognize there was a need for more regular investigative reporting in public media," he said. Barth, who was a founding producer on Marketplace, said launching a new public media show takes time — to find distribution, to build an audience, and to develop into the show you want. "The challenge is broad," Barth said. "Making sure you can amass talent on a regular basis to deliver this…investigative reporting doesn't happen on any kind of regular schedule." Of course, the other challenge will be financial. While the new grant funding will be important in launching the show, Alvarado said they'll be looking at underwriting, events, and other fundraising to support the show on a long-term basis. Alvarado is optimistic about the prospects for funding and building community around the show. "When you start to engage around harder reporting, people really want to talk about solutions," he said. "They want to have a regular platform to follow up and talk about it."
THIS WEEK'S ESSENTIAL READS: The key pieces to read this week are David Carr and Ken Doctor on what's behind the push for big media mergers, John Borthwick on clickbait, sharing, and attention, and David Boardman's warning to newspaper executives.

MURDOCH'S PLAY FOR TIME WARNER: Rupert Murdoch's latest in a long history of big media takeover bids was revealed this week when The New York Times reported that his 21st Century Fox made an $80 billion offer for Time Warner that was rejected last month. 21st Century Fox, the entertainment media properties of the former News Corp (Murdoch's news properties now make up News Corp), would be buying an entertainment company that includes Warner Bros. movies, Turner, and HBO, now shorn of its cable/broadband business and publishing business (which have split off as Time Warner Cable and Time Inc., respectively). Mashable's Andy Fixmer, Quartz's John McDuling, and Bloomberg's Erik Schatzker and Caitlin McCabe all said HBO is at the center of Fox's pursuit of Time Warner; it would give Fox one of the world's premier content properties and, through HBO Go, a major tool to compete with Netflix in the growing streaming video market. Bloomberg said Fox values HBO at $20 billion — a quarter of its total offer for Time Warner. Business Insider's Jay Yarow argued that Fox's interest is a bit broader: It wants to gobble up as many valuable TV properties as it can to improve its leverage with TV distributors. At Reuters, however, Jack Shafer was skeptical of the value of the deal for Murdoch, comparing it to the protectionist consolidation of publishers in the 1990s. Slate's Jordan Weissmann said the deal may be as simple as Fox digging deeper into a still very profitable business (TV), and asserted that if Murdoch wants Time Warner, he'll eventually get it. Ad Age's Simon Dumenco also made that point, declaring Murdoch "untouchable and unstoppable."
USA Today's Rem Rieder marveled at how Murdoch is bouncing back from News Corp's phone-hacking scandal. Still, Business Insider's Hunter Walker noted that antitrust regulators could stand in the way of a potential deal, and The Wall Street Journal's Keach Hagey looked at what a sale might mean for the Time Warner property CNN, which would be left out of the deal as an antitrust concession. Whether it's to Fox or to someone else, Peter Lauria of BuzzFeed argued that Time Warner will sell eventually, since it likely represents the best value for its shareholders at this point. Capital New York's Alex Weprin broke down several of the other potential buyers, and Peter Kafka of Recode said it will be bought by a company that wants to make a big bet on the pay TV business. The New York Times' Jonathan Mahler and Emily Steel analyzed the turnaround at Time Warner that's made the company so attractive. The New York Times' David Carr and the Lab's Ken Doctor both explained the climate of ever-bigger mergers and consolidations that has begun to swirl again around the media industry. They pointed to a couple of major rationales for these defensive moves — size yields negotiating power, and if you can't beat ’em, buy ’em — and noted that regulators don't seem to be a big obstacle: "For the most part, the current government has passed on regulating potential monopolies, and as citizens, we have become inured to the consequences of bigness," Carr wrote. Finally, USA Today's Michael Wolff and Financial Review's Neil Chenoweth looked at two behind-the-scenes players on each side who are helping engineer this possible deal: Time Warner's Gary Ginsberg, in Wolff's piece, and Fox's Chase Carey, in Chenoweth's.
GOING BEYOND CLICKBAIT AND ITS BACKLASH: "Clickbait" has been one of this summer's ongoing topics of discussion in the media world, and The Daily Beast's Emily Shire examined the anti-clickbait movement — exemplified by The Onion's Clickhole and Twitter accounts like @SavedYouAClick — as evidence that people are getting wise to the premise of duping and manipulating readers through unnecessarily coy headlines. Vox's Nilay Patel said clickbait headlines still work (most of the time) because they're essentially games for the reader to play, and Poynter's Andrew Beaujon posited that the main problem with clickbait is not the headlines, but the disappointing content that goes with them. "And yet," he said, "the blame often falls more heavily on marketing than the people churning out stuff that sucks." Betaworks CEO John Borthwick provided some data on the connection between attention and sharing that's the foundation of most clickbait's popularity, and found that there are many readers who spend very little time on pages after clicking but share the article anyway, sharing essentially based on the headline alone. But beyond those headline-sharers and the people who read on and are disappointed with the content, there are also a significant number of people who spend substantial time reading an article and are also quite likely to share it. Borthwick urged publishers to spend more time attracting those kinds of readers, and Gigaom's Mathew Ingram described Borthwick's findings as two versions of the online world: one noisy, fast, and click-driven; and the other deeper, slower, less noticeable, but still widely read and shared. One of the most prominent sites built around the former model, Upworthy, reported late last week that by far their most viewed, shared, and closely read pieces are not their own editorial content, but their native ads. 
At Contently, Joe Lazauskas gave a few reasons for Upworthy's remarkable success with native ads: It likely pays to relentlessly promote those ads on social networks, and the type of blandly feel-good content that makes for the best ads is exactly what Upworthy is already producing editorially.
WAS SI SCOOPING OR SUCKERED?: Sports Illustrated scored the biggest breaking sports news story of the year last Friday when it ran a first-person piece by NBA star LeBron James revealing that he would re-sign with his hometown team, the Cleveland Cavaliers. The essay was written as an as-told-to piece with veteran SI journalist Lee Jenkins. Deadspin, The Wall Street Journal, and the Cleveland Plain Dealer all provided some details about how the story came about: Jenkins got wind of James' decision on Thursday, pitched a first-person piece to his editors at SI, interviewed James and wrote the piece Thursday night, and handed it off to his editors on Friday. Deadspin reported that the idea for a first-person essay was first proposed by James' camp, but The New York Times reported that it came from Jenkins. The Times' Richard Sandomir criticized SI's strategy, saying the magazine gave up an opportunity to put some journalistic weight behind a big story. Said Sandomir: "the approach cast Sports Illustrated more as a public-relations ally of James than as the strong journalistic standard-bearer it has been for decades." In an online chat, The Washington Post's Gene Weingarten echoed the point, calling it an example of a journalistic mindset in which "being first is overvalued and being good is too often beside the point, or financially imprudent." Craig Calcaterra of NBC Sports questioned what exactly Sandomir was expecting SI to add to the story, characterizing it as commodity news as opposed to a substantial story crying out for in-depth reporting. Sandomir, he said, is "fetishizing the business of _serious journalism_ at the expense of understanding what sports fans actually care about, appreciating how informed sports fans already are and asserting that the reporter's highest and best function is to get between fans and the news as opposed to delivering it to them."
Poynter's Sam Kirkland said it's still possible for SI to break the story this way and do deeper journalism on it as well. (Jenkins was in Cleveland this week reporting a feature on James' decision.) And Deadspin's Kevin Draper looked at the other reporters who scrambled to get this scoop.
READING ROUNDUP: A few other pieces to read from this week:
— Industry analyst Alan Mutter pulled together some simple numbers to remind us just how dire the newspaper industry's situation is, and Temple University's David Boardman criticized the Newspaper Association of America's Caroline Little's rosy speech and instead urged newspapers to drop to one day a week in print. Little issued a defense of her picture of the industry.
— The U.S. Federal Communications Commission's public comment period on its proposed "fast lane" plans for Internet providers was supposed to end on Tuesday, but it was postponed until today because a surge of comments from net neutrality supporters overwhelmed its system. (The FCC passed 1 million comments this week.) The Washington Post's Brian Fung explained the proposal and backlash, and at The Guardian, Dan Gillmor urged net neutrality advocates to make their voices heard.
— Capital New York's Joe Pompeo profiled Brother.ly, the new Philadelphia-based online local journalism initiative by Washington Post/TBD/Digital First veteran Jim Brady, and Brady talked with Poynter's Butch Ward about what he's learned about local news.
— Finally, Nebraska professor Matt Waite wrote a thoughtful and important piece on the value of doubt in data journalism, with some ideas on how to better incorporate it.
Photo of Time Warner Center by AP/Diane Bondareff. Photo of bait shop by protoflux used under a Creative Commons license.
The latest round of small grants from the Knight Prototype Fund includes several projects and digital tools that could eventually prove useful to journalists. A developer from Grist wants to build tools to measure audience in ways outside of pageviews or clicks. Another project aims to create a better tracking system for court records in Massachusetts through a public database. Talkbox, a project from New York Public Radio, would repurpose old phone booths to create a two-way line between the community and the newsroom to help with reader engagement and reporting. All Prototype Fund grantees receive $35,000 to fund their ideas in the early stages. Each project goes through a prototyping workshop and instruction on human-centered design with the LUMA Institute. After that, teams have six months to work on their project before a demo day. (Obligatory disclaimer: Knight is a funder of Nieman Lab, though not through the Prototype Fund.) Here's the full list of 16 projects:
— DIY STORYCORPS by StoryCorps (Project lead: Dean Haddock): Advancing the mission of StoryCorps, a national program that records, preserves and shares people's stories, by developing a mobile app that allows anyone to create do-it-yourself interviews.
— DO PUBLIC GOOD BUTTON by Public Good Software (Project lead: Dan Ratner): Developing a tool that allows people to take action on important issues through news articles; for example, someone reading an article about drunk driving could click a button and connect with related charities and advocacy groups.
— ENGAGEMENT TOOLS by Grist (Project lead: Chip Geller): Allowing newsrooms to better measure audience engagement, beyond clicks and page views, by creating an open-source WordPress plugin that will measure "attention minutes" to determine how long users are interacting with content.
— FACTO_BOT (Project lead: Will Knight): Helping prevent misinformation on Twitter by developing software that identifies stories that have been modified, and alerts people who tweeted or retweeted links to these stories that content has changed.
— FILMSYNC APP by University of North Carolina (Project lead: Steven King): Creating an app that will connect people who are watching a news story or documentary on television with related content through a second screen app on their smartphones.
— GLOBAL I-HUB by ICIJ (Project lead: Mar Cabra): Making collaboration on cross-border investigative stories easier by providing a secure, easy-to-use platform for reporters to communicate through Facebook-like status updates, threaded communications on specific topics, individual messaging and file sharing.
— MARKET ATLAS (Project lead: Jon Gosier): Scaling a data provider network that allows citizens to collect and share microeconomic data from countries in Africa that lack financial infrastructure; providing reliable, consistent financial data should encourage greater investment in the area.
— OPENSTREETMAP PLUGIN FOR OPEN DATA KIT by Humanitarian OpenStreetMap Team (Project lead: Kate Chapman): Allowing easier collection of open geographic data, even in places with connectivity issues, by combining Open Data Kit's data collection with OpenStreetMap's data community.
— PATIENTSASSEMBLE by PatientsLikeMe (Project lead: Chris Fidyk): Helping people with chronic illnesses interact with policymakers through open-source collaborative tools that will allow users to provide feedback and shape issues that are important to them.
— PILOT FOR SCHOOL by The Virginian-Pilot (Project lead: Shawn Day): Building a targeted digital system that will allow Virginia teachers to search newspaper content and use it to complement class curriculums; content will align with Virginia's Standards of Learning and help students apply academic concepts to what's happening in their community.
— PUBLIC DATABASE OF MASSACHUSETTS COURT RECORDS by MassINC (Project lead: Steve Koczela): Allowing journalists and the public to better monitor court cases through an online filing and database system for Massachusetts court records.
— PUBLIC RECORD (ADVANCED EMERGENCY RADIO SCANNER AND REPOSITORY) (Project lead: Tal Achituv): Creating a tool that will allow journalists to better track current events and investigate past events; with the tool newsrooms can record interactions on police/emergency radios, set alerts and listen to archived content.
— QC TOOLS by Bay Area Video Coalition (Project lead: Carol Varney): Allowing media organizations, journalists and others to easily preserve analog video through an open-source video digitization app that is inexpensive and easy to use.
— TALKBOX by New York Public Radio (Project lead: Caitlan Thompson): Involving the community in news stories by repurposing phone booths in specific neighborhoods that will provide residents with a direct, two-way line to the New York Public Radio newsroom; a "Talkbox" can help with engaging new audiences or getting information in a neighborhood where a reporting project is taking place.
— THE LAST GRAPH (Project lead: Ben Conners): Helping journalists engage with audiences by allowing readers to interact with the final paragraph of a story through a database of "actions" that lead to reader involvement on an issue; for example, the last graph of an article on air pollution could include an action that encourages readers to sign a pledge to use public transit more.
— VERITZA (Project lead: Djordje Padejski): Helping reporters more easily find story leads from public records through a web platform that allows users to create alerts on information in these records; the platform will do this by scraping and aggregating data and analyzing it for patterns and anomalies.
While the United States' newspaper industry has faced more rapid disruption than any other country's, it's also benefited from the world's most vigorous nonprofit journalism sector. But with rapid changes affecting the German newspaper business, a new media startup there aims to bring the ProPublica approach abroad, creating the country's first nonprofit investigative news organization. CORRECT!V, a data-journalism-focused investigative organization, is being backed by the Brost Foundation with a grant of €1 million a year for three years. The foundation was created in 2011 via a €300 million gift from Anneliese Brost, the wife of German publishing mogul Erich Brost, most famous for founding the German newspaper Westdeutsche Allgemeine Zeitung. Correctiv (all caps removed for the benefit of our readers) has ambitious plans. Like ProPublica in its early days, it plans to publish mainly through partner organizations, in multimedia formats, including TV and radio. Its seven-person team, which they hope will grow to 20 within the year, is well connected in German media — for example, founder and director David Schraven had been head of investigations at Funke Mediengruppe since 2010. Those connections will make placing Correctiv content in popular outlets easier; Schraven says they're already in talks to develop an investigative radio show with a major German station. In addition, they have plans for multiple books — printed books, ebooks, and at least one comic book, about a fascist terrorist group. "We are completely focused on data journalism," Schraven says. The team intends to compile and share large datasets that map people in power to the money behind them, collaborating with local open data organizations as well as other newsrooms. Correctiv's outreach will extend to education as well — senior reporters on the team will travel throughout Germany, helping journalists and "regular people" learn data journalism skills.
"I don't know whether there's any newsroom in the world who does that," says Schraven. Being a first for Germany leaves it lots of room to define its territory and to learn from what's been tried elsewhere. "To be honest, there is no competition," says Schraven. "There are other newsrooms and guys around to do investigations, but a newsroom who is focused on this? There isn't anyone else." Before launch, members of the Correctiv team met with investigative nonprofits outside Germany, including ProPublica, the Global Investigative Journalism Network, and members of INN, to discuss the project. "They thought that we could tell our stories, get our stories, but they said we should be very careful on the funding side," says Schraven. "And they are right, this is the most important — to get the money." To that end, Correctiv hopes to diversify its revenue streams, including both additional foundation support and individual donors. Schraven says his goal for a few years down the road is an annual budget of between €3 million and €4 million. As Ken Doctor has written for the Lab, foundation funding is rarer in Germany because foundations are rarer, which could make life difficult for Schraven. Stephan Weichert, director of Hamburg Media School's digital journalism program, says Germany doesn't have a long history of foundation-funded news organizations. "Unfortunately, we don't have a lot of wealthy philanthropists thinking of journalism as a sponsored field yet. We also don't have foundations like the Knight Foundation that push real money into journalism to stimulate innovation and entrepreneurship in the news industry," Weichert says. But that doesn't mean that noncommercial media is entirely new to Germany. Die Tageszeitung, a German political news outlet that focuses on small countries and outsider politicians, has been cooperatively owned since 1979.
Editor-in-chief Ines Pohl says 14,000 people pay between €5,000 and €20,000 for a spot among the paper's shareholders. "This whole crowdfunding idea is the birth idea of the taz," as the paper is known, says Pohl. "We had a thousand people paying for the first edition before it was printed."
Pohl says while German media needs an infusion of capital, there are two challenges inherent in accepting philanthropic money. First, the organization should endeavor to provide oversight that ensures editorial independence from its benefactor. Second, she says, "the funding in the beginning might be easier than funding over time." But both Schraven and Weichert agree that foundation funding could have a future in Germany. Weichert himself helped found VOCER, a media startup that is "completely financed by foundation money." Schraven believes that philanthropic support for Correctiv, and for all German news startups, will grow. "I'm talking to a lot of foundations that fund cultural stuff — museums, and other stuff," he says. "They see that we've got problems with our papers. They see that we've got problems with our news industry. I'm pretty sure I can convince some of them to fund us — to change their idea of funding." So, yes, Correctiv is Germany's first nonprofit devoted to data-centric investigative journalism. But Germany's news-minded citizens have long been familiar with important reporting supported by something other than circulation or advertising, even if the "nonprofit" classification (or _Verein_ in German) is relatively new. In fact, in some ways, the narrow focus on big investigations may be more novel to a German audience. "Investigative journalism in Germany isn't so big. That has a lot to do with our privacy laws. It's much more difficult to really dig deep," says Pohl. "It is changing over the years, but the tradition isn't so big." Pohl pointed to the organization Netzwerk Recherche (roughly, "investigation network") as an example of growing interest in expanding investigative efforts in Germany. NR is, among other things, interested in working to assure German nonprofits the same tax benefits that similar organizations receive in the U.S. Weichert also sees investigative journalism as a potential growth area in Germany.
"There have been a lot of investigative divisions established in the last few years within the newsrooms," he said in an email. "Furthermore, some newsrooms decided to build networks between traditional publishing houses and public service broadcasting authorities. One good example is the ongoing collaboration between Süddeutsche Zeitung, Norddeutscher Rundfunk, and Westdeutscher Rundfunk. A handful of the best reporters are working together under the lead of Georg Mascolo, the former editor-in-chief of Der Spiegel." Although some in the German media industry are critical of merging public funds and private dollars, Pohl says she believes, in the long run, collaborations between existing major outlets are more likely to be successful than startups, which are burdened by having to fill their coffers and build their brand simultaneously. But, she adds, if there was a mutually interesting project, she'd be interested in having the taz and Correctiv collaborate. She also has faith in Schraven as a leader. "He's not only a money maker, he's not someone who wants to do a cool startup business," Pohl says. "He's a true, true journalist." In the end, the real proof of what Anneliese Brost's fortune can do for German journalism will be borne out in the work Correctiv produces, and when it comes to getting stories, Schraven says he has more than enough whistleblowers lined up. In addition, the team has multiple data projects already underway, including "a broad overview of the mobster structures in Germany" with a database of over a thousand individuals, plus a healthcare project that's comparable to ProPublica's Dollars for Docs. If those projects pan out, Correctiv could stand as an example to future German donors of what happens when investigative journalism is supported. Next to corporately funded Investigate! and crowdfunded Krautreporter, it could even be that Germany is seeing the beginning of a serious turn away from reliance on legacy media.
"Some big papers in Germany are in trouble now, some are already closed. That's a new thing here," says Pohl. "It's kind of easy to get the money now for the funding, but to keep these organizations running — that will be the big, big challenge. And that will depend on the success these groups, like David's, will have."
Photos via the Correctiv Instagram account.
Rupert Murdoch's announced $80 billion pursuit of Time Warner this morning seemed like a bolt out of the blue to many. But the strong winds of consolidation make this kind of foray — and the others likely to follow — absolutely logical. Consider all the kinds of consolidation we're seeing done, or attempted, in the entertainment/news businesses. Last year was among the biggest in recent years for local broadcast consolidation, with Tribune, Gannett, and Sinclair, among others, bulking up and Local TV, Belo, and Allbritton taking the money and running. (If Fox were to swallow Time Warner, the newly spun-off Time Inc. ("The newsonomics of Time Inc.'s anxious spinoff") would be the only remaining "Time" — maybe a cosmically befitting result.) In May, AT&T struck a deal to buy DirecTV, one that is now under regulatory review. Big is back, in a huge way. Of course, it never really went away. But post-Great Recession confidence and deep coffers — 21st Century Fox has at least $5.5 billion in cash, and access to lots more; Goldman Sachs would finance the Time Warner deal if it were to happen — are now juicing it. But larger forces are shaping the quest for size. The digital disruption of the TV/film/video businesses is a prime driver. Consumers are moving madly from constrained, through-the-old-pipes broadcast to over-the-top products. The astounding American conversion to global _fútbol_ is in significant part attributable to ESPN mastering its WatchESPN mobile/web experience. Early reports show that such non-"TV" watching produced major new audience; about 3.2 million viewers tuned into WatchESPN's app to watch the USA-Germany game, for example. (Poor Time Inc. even had to tell its staff not to stream matches at their desks — not because they should be doing real work, but because all the bandwidth was bringing the company's network to its knees.) So for this World Cup, masses of Americans learned what it means to "authenticate" through their cable suppliers to follow games on their iPhones and Androids.
Now consider ESPN, owned by Disney, and its plans into the future. It commands the highest rates in the land for cable and satellite carriage — about $5.54 per subscriber. Still, though, the rat-a-tat-tat of cord-cutting stories and advice, including yesterday's from the Journal's Geoff Fowler, will just get louder and louder — and more acted upon. Think about where this is going. ESPN, like Time Warner's HBO, wants to have it both ways: keep up its rich stream of cable fees _and_ offer an increasing array of direct-to-consumer products. The pipes companies, however consolidated or not, want to keep the lid on à la carte offerings as long as they can — and partake in some à la carte revenue themselves. (Witness Comcast's current one-off, pay-per-view selling of movies as just one example.) It's murky how this will all turn out. You could argue that the digital revolution puts consumers in the driver's seat, forcing an unbundling. But the bigger the pipes companies get — consider a merged Comcast/Time Warner and AT&T/DirecTV world — the more raw power they have to maintain the legacy business models as long as possible, and then to negotiate the most favorable deals in a cord-cut world. Today's attempted deal is, in large part, about getting a negotiating edge on the other guy. Improve your deal by a few dimes on a cable customer or a revenue share and, over time, billions of dollars hang in the balance. Then add in the widening blur in the emerging screens world. The Supreme Court's Aereo decision has bought the local broadcast chains some time. It laid to rest immediate doubts about the lucrative and growing retransmission fees the broadcasters get. They are now able to forecast $7.6 billion in cable and satellite retrans fees by 2019. Of course, to maximize that revenue, big is the operative word. The broadcast consolidation has provided its own clout — an escalating battle of big vs. big vs. big.
Big is completely logical from a corporate point of view. With Netflix, Amazon, Google (and then Apple, Yahoo, and hosts of smaller players) busting down the doors among TV, movies, and digital video, one question is how to manage the digital blur. We know where this is going, with digital video eating up the categories of "broadcast" and maybe even "movies" over time. The question is what the new ecosystem looks like. In that ecosystem, there's at least a three-part dance. There will be the pipes (cable/satellite) companies, still with huge power in the United States. There will be the studios, producing the bigger, mass video entertainments, like the 21st Century Foxes and Time Warners. Then there'll be the digital-native companies, stoked by the confidence that years of astounding digital business disruption builds. All the legacy companies — 21st Century Fox, Time Warner, AT&T, DirecTV, the big broadcast groups, among others — feel the hot uncertain breath of Netflix, Amazon, Apple, and Google on their necks. All of the digital giants are beginning to blast away at the traditional bundles, habits, and pricing. All are eating away at legacy companies' customers and cash flows. We see uncertain legacy responses like Hulu, which is clearly insufficient in staunching the tide. So the efforts at consolidation are as much defensive as offensive. A combined 21st Century Fox/Time Warner would produce about $65 billion in revenues. That's the size of…Google. Google's net income this year should be in the neighborhood of $14 billion. Figure a combined 21st Century Fox/Time Warner would come in just below that number. The argument: In a Google-dominated world, you have to bulk up to compete. One other argument is the usually over-hyped "synergies." Acquiring CEOs like to put big numbers on the likely "synergies" in such consolidation. Murdoch's first number, subject to the due diligence that Time Warner CEO Jeff Bewkes has so far rejected, is $1 billion, a nice round one.
In Amol Sharma's good take on the Fox pitch today in The Wall Street Journal, he quotes Janney Capital Markets analyst Tony Wible, who said last month about such a deal: "However improbable it may seem, one cannot overlook this megadeal given its immense financial benefits that dovetail with a number of strategic benefits," noting their combination of cable channels, studios, and rights to major sporting events. Translation: Synergy as clout. Yes, headcount can inevitably be cut, especially expensive corporate staffs, but redundancy isn't the major driver here. Market clout — if not quite market domination — is. Of course, there are a couple of other noteworthy players here: consumers and creatives. For creatives, this new golden age (quick, to-the-point Derek Thompson explanation here) of boundary-busting, digitally driven, high-quality TV has been an unexpected boon. They've got a number of big studios to pitch to and negotiate with, including Fox, TW, Scripps, and AMC, and now the Netflixes, Amazons, and Yahoos. Consolidation of studios could again rearrange the relative bargaining power of the creatives and the network execs. Maybe, digital disruption would open new doors, even if some older ones get boarded up, or maybe not. For consumers, it's a blur of big names and unclear implications. All the consolidators, in cable, satellite, broadcast, and studio, make a similar case: _We need efficiencies so we can invest in the digital products and technologies of tomorrow, which will produce consumer gain._ Usually included is a feint that pricing will go down, given all these efficiencies, though there's scant evidence of that. Americans already pay about twice as much for TV/internet packages as do our European cousins — and their broadband is usually both faster and regulated. 
When it comes to regulations and laws ("Mind your own business, Facebook and Google"), regulators and politicians are a couple of decades late and many dollars short in confronting the nature of digital business domination. While the Europeans fight a rear-guard anti-monopoly battle against Google (which is even more dominant there than in the U.S.), the great business boundary-disrupting of digital media has perplexed and flummoxed those trying to figure out The Public Interest here. In many ways, "antitrust," "the public interest," and "local station diversity" all seem like artifacts of another age, waiting to be redefined. First, there's the question of what consumers will end up paying ("The newsonomics of Comcast's deal and our digital wallets"). Second, there's the question of how all these business changes affect us as citizens. To be sure, most of this is about "entertainment," but that broad category also includes the kind of hard-hitting documentary and storytelling work that HBO, for example, excels in. Then there's the news component. 21st Century Fox preemptively said in its narrative on today's offer that it would split off the Time Warner-owned CNN — making the point that it wouldn't be forced into a shotgun marriage with archrival Fox News. That's pure Murdochian strategizing. Although it's hard to see _who_ would have the regulatory authority to review a Fox News/CNN merger (yikes!), Murdoch is paying attention to the court of public opinion and getting ahead of what could be/could have been a major stumbling block. Rupert — and son James — scoped out this one well, and don't think we've heard the last of it, despite Jeff Bewkes' immediate stonewalling. Time Warner — and much of the entertainment/broadcast landscape — is solidly in play. Meanwhile, let's remember that Rupert likes playing more than one game at a time. With persistent word that he wants the L.A. Times and may buy all eight Tribune newspapers if he needs to in order to get to it, we see the next stage of the Tribune modern-day soap opera unfolding. Come Aug.
4, Tribune Publishing will finally be split off from Tribune Corp. If Rupert's people haven't yet had conversations on the acquisition, expect them to commence over the next several months. Let's remember he also salted away $2 billion in split-off News Corp, and Tribune papers may well be bought for a quarter to a third of that sum.
Photo by cncphotos used under a Creative Commons license.
It took almost a year and a half for nonprofit news site Eye on Ohio to get its tax-exempt status approved by the IRS this spring. Launched in the fall of 2012 by former business reporter Lori Ashyk, the site aims to follow the model of many smaller nonprofit news startups: reporting some of the investigative stories and statewide news that have fallen by the wayside as daily newspapers have shrunk. Ashyk knows the stories of other news nonprofits who've had difficulty navigating the tax-exemption process at the IRS and seen their applications disappear into a kind of limbo. "Compared to some of the other news organizations, it's not that bad," she said. But there might be new relief for at least some journalists looking to get into the world of nonprofit news. This month, the IRS introduced a new application that makes getting tax-exempt status not much more complicated than ordering a pizza online. What was once a 26-page form has been cut down to three, and groups will now only have to pay a $400 fee rather than $850 to apply. Not all would-be nonprofits will be eligible for the streamlined process: The new form is only open to groups with annual income of less than $50,000 and assets of less than $250,000. While that might seem like a low threshold, the IRS estimates around 70 percent of the organizations applying for 501(c)(3) status will qualify. And with the greater ease of use will come less government oversight: Some groups, such as the National Council of Nonprofits and the National Association of State Charity Officials, say the streamlined process is an invitation for groups to abuse tax-exempt status.
The universe of nonprofit news outlets has expanded in recent years as traditional media have shrunk and some reporters have moved into more entrepreneurial roles. But one element of that growth has moved at a pace that fluctuates between erratic and glacial thanks to that 501(c)(3) approval process. In one of the more notable cases, it took the San Francisco Public Press 32 months to get approval from the IRS. Last year, a coalition of organizations including Knight Foundation and the Council on Foundations issued a report calling for the IRS to change its "antiquated and counterproductive" rules for granting tax-exempt status to journalism nonprofits. The IRS says it has a backlog of 60,000 applications, with most pending for at least nine months. The new 501(c)(3) process was one result of the scrutiny the IRS received for the way it reviewed political groups seeking tax-exempt status. The idea behind the new application is to help small nonprofits avoid that tax-exempt purgatory and to help an IRS with fewer resources. According to IRS Commissioner John Koskinen in an interview with Time, the change will mean the division that reviews tax-exemption requests will see 40,000 to 50,000 fewer applications. "If I was someone starting from scratch right now, I would certainly be looking at this. It's a lot easier than it has been," said Brant Houston, board chair at the Investigative News Network, which includes about 100 nonprofit newsrooms. The existing IRS review process for 501(c)(3) status is rigorous, but largely opaque, as organizations often received little information about the status of their application, Houston said. For many, that meant finding a fiscal sponsor to help manage the financial tasks in the early days of the startup, Houston said. "People spent years waiting to go through the process, and it was not entirely clear why some are waiting two years and others go through in four months," he said.
While the new 501(c)(3) process is simpler than in the past, it comes with a test. Any group wanting tax-exempt status through the new 1023-EZ form has to pass a seven-page, 26-question eligibility worksheet where just one "yes" answer disqualifies an organization. The other requirements are the annual revenue and assets ceilings. Big-name journalism nonprofits like ProPublica, the Center for Investigative Reporting, or The Texas Tribune crossed the $50,000 income hurdle a long time ago. But there's still a long list of small and growing organizations whose budgets fit that requirement, said Andy Sellars, a clinical fellow at the Cyberlaw Clinic at the Berkman Center for Internet and Society. "That $50,000 mark is well within the operating budget of a lot of these new organizations," Sellars said.
The Cyberlaw Clinic provides pro bono legal aid to a number of nonprofit news sites, many of whom have a relatively small staff and limited budget, Sellars said. While efforts like the newly launched Marshall Project come with substantial financial backing from the outset, others are scraping together donations and using their own money. "This is a tremendous change for a lot of organizations, and I hope it means organizations can increasingly do this on their own as well," Sellars said. "It's one less obstacle going forward." But just because the application for becoming a 501(c)(3) is simpler doesn't mean news nonprofits can overlook the process. Journalism itself still isn't recognized as an activity eligible for nonprofit status. Most nonprofit news organizations instead put themselves in the category of education. It's not a perfect fit, but it's close enough. Here's the key line on "specific activities" on the 1023-EZ form:
Example 2. An organization whose activities consist of presenting public discussion groups, forums, panels, lectures, or other similar programs. Such programs may be on radio or television.

Example 3. An organization which presents a course of instruction by means of correspondence or through the utilization of television or radio.

Though the new 1023-EZ application offers an easier path to gaining 501(c)(3) status, the original form can be very instructive to would-be nonprofits, said Eric Gorovitz, a lawyer at Adler & Colvin who specializes in nonprofit legal issues. The new form is short on details and explanations, instead asking applicants to supply basic information under the "penalty of perjury." Earlier this month, Gorovitz wrote about the new form:
Unlike its much more detailed sibling, the Form 1023-EZ asks the applicant to attest to a series of conclusory statements about its governing documents, purposes, and activities, but does not require elaboration or attachments. Applicants using the new form do not have to provide any details about, for example, their relationships with insiders or their finances.

In the longer form, applicants have to explain in detail what their organization does, how it will be structured, and how those things fit within the boundaries of what the IRS allows. That has the benefit of sharpening an organization's focus and helping it better understand what it's allowed to do, he said. For journalism nonprofits, that might mean addressing the issue of revenue sources (the ratio of advertising and subscription revenue to other sources) and being explicit about any political activity (editorial-style candidate endorsements, for example). "It probably makes it easier for small journalism enterprises to get started," Gorovitz said of the new form. "The question is, when they get bigger, doing things they think are fine, but have not learned they aren't fine." (Interestingly, Gorovitz also points out another side effect of the new 1023-EZ: a loss of detailed background information for journalists _investigating_ nonprofits.) At places like Eye on Ohio, it's likely the new 501(c)(3) application would have helped things get up and running faster. Though the process of getting tax-exempt status was long, Ashyk says it was useful because it helped shape her strategy for the site. Because of a question she received from the IRS about community involvement, Ashyk is planning to set up a community advisory board to help guide the site. Getting the seal of approval from the IRS makes a big difference in the livelihood of nonprofits like Eye on Ohio, but of course it's just the beginning of the work.
"We're very much in a startup stage, even though we've been around for over a year," Ashyk said. "Things haven't really changed that much. We'll be in charge of our own funding now — but since we don't have any it hasn't made a whole lot of difference yet."
Photo of tax forms from the Brookline Library used under a Creative Commons license.
Let’s say you have a research topic, and maybe even an angle. You dive in by reading the canonical classics, all of which seem to cite one another, and maybe some of the most recent debates. Now what? Or perhaps you’ve been studying the same topic for years and feel stuck. How can you find a fresh take on a stale debate? By this point, you might have exhausted the help that discovery platforms like Google and Facebook can provide. Google will reveal the most-cited works (especially on the more specialized Google Scholar or Google News), and Facebook might yield the ones your friends or subject experts value — but there's no easy way to break out of the networks that define these platforms. Libraries provide content-based discovery portals, which offer one way out, but they often give you too much to wade through, with clunky interfaces and varying levels of relevance.
These limitations are not exclusive to serious researchers. News consumers frequent the same platforms, and they are likewise directed to the most cited, the most retweeted, and the most relevant keywords. Network-based, big data methods for sorting the wheat from the chaff carry promise, but they rely on their own assumptions about value (mostly based on what's already popular or viral), and they risk boxing out hidden gems and chance encounters in the process. In other words, the filter bubble affects history scholars as much as casual news browsers — and scholars' careers often depend on unearthing something rare and different. As a result, some researchers in the humanities and library worlds are looking for possible paths out of the research bubble. By looking at existing browsing and searching habits in both physical and digital environments, they hope to help scholars never miss the information they need — a problem that carries great weight in the news world as well. The goal, in effect, is to increase the role of serendipitous discovery in online research. Old-school types are nostalgic for the days of walking into the library stacks and seeing what books catch one’s eye; digital tools often have trouble enabling this sort of accidental discovery, where a user finds something valuable that they didn't even know they wanted. But serendipitous encounters don’t have to be analog; if anything, digital tools should be able to foster _more_ serendipity, since they can effortlessly reorder categories, effectively rearranging the stacks based on the researcher’s avenue of inquiry. But how would one engineer serendipity — and can we even call something serendipitous if it was engineered?

WHAT IS SERENDIPITOUS?

Serendipity can be loosely defined as a chance encounter or an accidental discovery that leads to added insight or value. It seems random, but this definition goes beyond merely injecting randomness into an algorithm.
One definition, proposed by Gary Fine and James Deegan, is “the unique and contingent mix of insight coupled with chance.” The “insight” part is crucial. Serendipity requires a user who is ready to make connections that aren’t obviously there — making it a particularly difficult problem for a computer. In attempting to classify serendipity, Stephann Makri and Ann Blandford see three facets: how _unexpected_ was the encounter or connection; how much _insight_ did it require from the person making it; and how much _value_ did it give them? Whether or not this works for every instance, it shows the variety of ways in which one can define an encounter as serendipitous — and how often a seemingly lucky event was in fact somewhat directed. Finding a fortuitous article on Facebook or making an important contact at a conference still requires following the right person or attending the right conference. Anabel Quan-Haase, a professor at the University of Western Ontario, has been researching the role of serendipity in the research process for humanities scholars. She sees serendipitous discovery as a function of a researcher’s “prepared mind,” the necessary first step. For any accidental connection to occur, the user must be ready for it, which makes the timing of the encounter crucial. That's a challenge for digital tools geared toward highly targeted search, where a user is already in the mindset of looking for a specific item. Beyond a prepared mind, a user must notice the find, stop to review it, extract the information, and finally return to it for future use. In each of these steps, design and user experience play a crucial role — even beyond the initial question of engineering a random-but-relevant encounter.
ENGINEERING SERENDIPITY

A user must be mentally prepared for an accidental insight, but it’s unlikely that they’re thinking “I’m feeling serendipitous today.” So standalone platforms that encourage random discoveries could limit the ways in which serendipity can integrate into our digital lives. But Quan-Haase says that adding serendipity to targeted search would make little sense, given how long we have spent honing the search experience for specific results. Instead, perhaps, we could augment rather than replace search — something like a “serendipity widget,” for example, in the sidebar of a search interface. Such a widget could display articles at random pertaining to the keyword — or perhaps it could target a little further. One could envision a system that looks at your past searches and attempts to blend them with your present one, or grabs exclusively from sources you don’t normally peruse. You might call this a separate facet for targeted discovery rather than a truly serendipitous encounter (again, there are levels of serendipity), but it could serve the goal of finding what you didn’t know you wanted. Many users in Quan-Haase’s studies cite Twitter as a serendipitous platform. I for one have found many of my most useful sources while randomly browsing Twitter, sometimes after hours of fruitless searching in specialized databases. I know I don’t see every tweet by everyone I follow, but I also know that some of the most inspiring tweets or links won't be found by simple heuristics like most-retweeted or most-favorited — so I often follow my firehose in hopes of a nugget of gold, and quite often I am not disappointed. This suggests that Twitter may be a more serendipitous platform than Facebook or Google, which emphasize more targeted customization and personalization.
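The "sources you don't normally peruse" idea is easy to prototype. Here is a minimal Python sketch of such a widget, assuming a hypothetical function that weights results from unfamiliar outlets more heavily; the name and the weighting scheme are illustrative, not any real platform's API:

```python
import random

def serendipity_widget(results, familiar_sources, k=3, seed=None):
    """Pick k results, favoring sources the user doesn't normally read.

    `results` is a list of (title, source) pairs; `familiar_sources` is
    the set of outlets the user already frequents. Unfamiliar sources get
    a higher sampling weight, so the widget leans toward what the user
    didn't know they wanted without being purely random.
    """
    rng = random.Random(seed)
    # Weight items from unfamiliar sources 3x more heavily.
    weights = [1 if source in familiar_sources else 3
               for _, source in results]
    pool = list(zip(results, weights))
    picks = []
    for _ in range(min(k, len(pool))):
        item, _ = rng.choices(pool, weights=[w for _, w in pool], k=1)[0]
        pool = [(it, w) for it, w in pool if it != item]  # no repeats
        picks.append(item)
    return picks
```

Tuning the familiar-to-unfamiliar weight ratio is where the "levels of serendipity" trade-off would live: a higher ratio pushes the widget further from the user's usual reading.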
This, along with the ease of use of Twitter's API, might also explain why so many organizations use Twitter to create whimsical bots that inject a bit of randomness into your feed. For instance, the Digital Public Library of America's DPLA Bot grabs a random noun and uses the DPLA API to share the first result it finds. Lamenting that “the API has no means of calling up totally random items,” the DPLA Bot aims to “infuse what we all love about libraries — serendipitous discovery — into the DPLA.” For now, though, this random dive into the digital stacks is not personalized, which means you could end up in the wrong section of the library.
Crazy about chivalries? Then see "The history of chivalry /" at http://t.co/gJDSXVz47W — DPLA Bot (@DPLAbot) July 12, 2014

The British Library's Mechanical Curator similarly posts random resources with no customization, but its special focus on images from the library's 17th- to 19th-century collections gives it a lighter and more visual feel. More for curiosity seekers than serious researchers, the library suggests on its blog that “the pursuit of knowledge is not the point.”
Photo: Image from ‘Chambers’s Alternative Geography Readers. Standard IV.(-VII.)’, 000655321 Author:… http://t.co/0HhdHhBZqt — Mechanical Curator (@MechCuratorBot) July 15, 2014

The TroveNewsBot, built on the National Library of Australia's 370 million resources, features more interactivity. Send the bot any text, and it will dig through the Trove API for a matching result:
@mailbackwards 28 Sep 1992: 'Serendipitous combination September 25′ http://t.co/gg4kmwliOT — TroveNewsBot (@TroveNewsBot) July 15, 2014

It doesn't stop there: adding #earliest gives the first result in their collection, #latest the most recent; you can also limit the query by year and location. Give the bot a URL and it will fetch the link’s keywords and query the API with them, allowing TroveNewsBot to “respond” to any article on the web. The bot strikes a nice balance between targeted search and random luck, although your luck starts to run out if your interests lie far from Trove's collections (primarily Australian newspapers published between 1803 and 1954). Regardless, it's good fun, as exemplified by the TroveNewsBot's guide to child rearing.

DESIGNING FOR SERENDIPITY

Veering away from Twitter, one tool that seems to get serendipity right is Serendip-o-matic, a project of the One Week | One Tool initiative. Brian Croxall explains that due to the project's one-week time frame, experimentation and play were baked into the development process and emphasized at the outset over feature-complete engineering marvels. Rather than using language like “select” or “upload,” the tool suggests that you “grab some text.” When you hit “Make some magic!” the tool peruses digital collections from the DPLA, Trove, Europeana, and Flickr, returning a series of multimedia documents that hopefully broaden your horizons on the topic at hand. As might be expected, some results are more serendipitous than others. It’s also hard to know why a certain image or document was selected, which could otherwise be helpful in directing future searches. All the same, Serendip-o-matic’s playful setup and language prime the user well for making accidental discoveries. These tools (along with others, such as the EuropeanaBot) primarily target digital humanists and historians who are in a rut, but they each have their own insights about what is serendipitous versus simply random.
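The #earliest and #latest directives TroveNewsBot accepts amount to a tiny query language layered over search. As a sketch of how such a bot might parse a mention, here is a hedged Python example; the sort-order strings are placeholders of my own, not actual Trove API parameter values:

```python
import re

def parse_bot_query(mention):
    """Turn a tweet at the bot into (search terms, sort order).

    #earliest maps to oldest-first and #latest to newest-first, echoing
    TroveNewsBot's directives; the hashtag and the bot's @-handle are
    stripped from the search terms.
    """
    sort = "relevance"
    if re.search(r"#earliest\b", mention, re.IGNORECASE):
        sort = "date_asc"
    elif re.search(r"#latest\b", mention, re.IGNORECASE):
        sort = "date_desc"
    # Remove @-handles and directive hashtags; keep the rest as the query.
    terms = re.sub(r"(@\w+|#earliest\b|#latest\b)", "", mention,
                   flags=re.IGNORECASE)
    return " ".join(terms.split()), sort
```

The same shape extends naturally to the bot's year and location limits: each directive strips itself from the text and sets a query parameter.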
It is difficult to plan for unplanned discoveries, especially so for a computer. Events are only serendipitous in hindsight, consisting of varying levels of planning versus dumb luck. But it seems quite possible to design for serendipitous discoveries, and to help put a user in the mindset for it.
Imagine a "serendipity widget" in your Facebook or Twitter feed, or in the sidebar of a New York Times article. The number and variety of signals that could go into it are endless, and many would bring their own biases. All the same, it would at least offer another pathway into news, one that relies on different assumptions, adds a sense of playfulness, and reminds users that there's more than one way to slice content. Injecting randomness and play into recommendation systems could be valuable in its own right, but it seems especially timely given the current moment's intense focus on content personalization. We all want relevant information, but perhaps you want to see something that users _unlike_ you liked, or something no one has ever stumbled across before. Controlled randomness could be one small way to push back on hyper-curation.
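One common way to implement that kind of controlled randomness is the explore/exploit pattern from recommender systems: serve the personalized pick most of the time, but occasionally swap in something from outside the top of the ranking. A minimal sketch in Python, with illustrative names rather than any production system's API:

```python
import random

def recommend(ranked, epsilon=0.1, seed=None):
    """Serve the top-ranked story most of the time, but with probability
    `epsilon` pull one uniformly at random from the rest of the pool: a
    small, controlled dose of serendipity in a personalized feed.
    """
    rng = random.Random(seed)
    if len(ranked) > 1 and rng.random() < epsilon:
        return rng.choice(ranked[1:])  # explore: skip the obvious pick
    return ranked[0]  # exploit: the personalized choice
```

Setting `epsilon` is an editorial decision as much as a technical one: it is literally the fraction of the feed given over to chance.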
Photo by Bob Gaffney used under a Creative Commons license.
WFMU received grants totaling $400,000 from the Geraldine R. Dodge Foundation to launch the product, the first of which was issued last July. A portion of the funds was used to bring on new developers to build the platform, and the station is scheduled to roll out a prototype, which will be tested by WFMU and two other New Jersey radio stations, in the next month or two. It plans to continue developing features and adding organizations through 2015, Freedman said. "We’re approaching this as an open-sourced project because so many media organizations grapple with the exact same problems, even in different media," Freedman said in a telephone interview. "I’ve been struck with how digital journalists are facing a lot of the same issues as broadcasters."
Indeed, Molly de Aguiar, director of media grants at Dodge, said the foundation agreed, and believes news organizations could benefit from Audience Engine, even though WFMU isn't primarily news-focused. (They do offer some public affairs programming.) "We are very energized by WFMU’s creativity and resourcefulness," de Aguiar said. "There’s so much that they do with their own community of listeners that can be shared with news organizations in New Jersey." Throughout the development process, WFMU has received advice from Union Square Ventures, a venture capital firm that was an early investor in companies like Twitter and Tumblr, and the Open Source Initiative, a group committed to promoting open-source software. They also hired Open Tech Strategies, a consulting firm, to advise them at the start of the project. One of WFMU's top goals for Audience Engine is to make it self-sustaining. To that end, the station is creating a for-profit subsidiary to assist customers in getting on board, from customizing the platform to training their staff. "We hope that we will be able to create a revenue flow that we will be able to keep developing it and keep the project up and running without constantly having to go back to foundations for money," said Freedman. Audience Engine will include the core functions of the WFMU system — fundraising and community engagement. For example, during WFMU pledge drives, fundraising widgets not only pop up on the website, but are also embeddable so staff members and listeners can encourage donations on WordPress, Blogger, Tumblr, and other websites. Users will also have the ability to easily post audio, video, and text. Other top Audience Engine features include a comment system that Freedman says allows "audience members to communicate and chat with each other without the quality of that discussion going down into the pits of hell, like a YouTube comment thread." 
He's also excited about the platform's annotation tool, which he hopes will add more permanence to user contributions. "There are great, substantial, creative contributions that people make related to the content that we, as a publisher, are putting out there," Freedman said. "What we want to be able to do is to take those great contributions and annotate our content with them so that it’s not just a stray comment in the wind that, after the show, is lost forever. We want to be able to permanently attach it to the record of the content that we’re playing."
Audience Engine is likely to be most appealing to small- and medium-sized organizations, but even so, Freedman recognizes that not every newsroom will need or want the entire suite of tools. "We’re trying to make it modular, so people can just pick and choose and use pieces of it until hopefully they're comfortable with just using the entire thing," he said. Others, like Mark Maben, general manager of WSOU and one of the participants in the Audience Engine pilot, are drawn to Audience Engine because it combines so many functions — content creation, audience engagement, _and_ fundraising — into one system. "Any avenue that can be done to bring in donations to the station, that’s good for everybody," he says. "There’s not a noncommercial station out there that’s going to say no to something like that."
Photo of an engine by Ersu used under a Creative Commons license.
Just five months after opening up its Data Store — which sells some of the big datasets its reporters produce for stories and projects — ProPublica says it's generated "well over" $30,000 in new revenue. That figure comes from ProPublica president Richard Tofel in an interview with Southern Methodist University journalism professor Jake Batsell. Since they opened up shop in February, Tofel says more than 500 data sets have been downloaded.
“The 500 downloads, that’s probably more important from a mission standpoint,” Tofel told me. But those who have paid for the premium sets — so far, mostly companies and consultants from the medical industry — may well become repeat customers down the road, when it’s time to update the data. “I think we would consider it a successful experiment in that sense,” he said.

The store also fits ProPublica's broader mission of making more data publicly available. Here's ProPublica data reporter Ryann Grochowski Jones on the store: "Our databases are a finished product."
If you look at the list of most popular stories on New York magazine's front page, you'll see in the No. 2 slot this piece by Ann Friedman, "Why I'm Glad I Quit New York at Age 24." And if you click through, you'll see that it was published last September. Why did it resurface? New York pushed it out on Facebook yesterday, as if it were a new story. And why did it do that?
@jessmisener https://t.co/iGkE9J0evm -- Stefan Becket (@stefanjbecket) July 14, 2014
@jessmisener ~75% of our Facebook followers didn't see it the first time. -- Stefan Becket (@stefanjbecket) July 14, 2014
@jessmisener (i.e., we have a lot more now) -- Stefan Becket (@stefanjbecket) July 14, 2014

New York now has over 950,000 likes on Facebook. And, as publishers are keenly aware, most of your Facebook fans never see what you publish anyway. Good on New York for finding a new audience for an old piece. But I also want to highlight something useful that the BBC is doing to counteract the impression that years-later bursts of social activity can give — that an old news event is happening right now. As spotted by Meg Pickard, formerly of The Guardian (and a few days earlier by this guy):
Not sure how long this notification's been on BBCNews. Should reduce confusion when old stories surface. Context=good pic.twitter.com/o3KDS0db8e -- Meg Pickard (@megpickard) July 13, 2014
One reason BBC context flag is good: this story comes up fairly regularly, after a flurry of views from blogs/SM etc http://t.co/HkHXnIFRM6 -- Meg Pickard (@megpickard) July 13, 2014

(The bra story in question is about Brandi Chastain at the 1999 Women's World Cup and was originally posted in April.) Leaving New York is eternal, and I imagine Chastain's story will continue to be told for a long time too. In these cases, their newness or oldness isn't crucial. But that's not true of all news stories. I'm glad the BBC now has a system for highlighting when a popular news story isn't necessarily a new one.
In case you spent Friday under a rock, and that rock had no wifi and, like, only one bar of signal, all-world basketball player LeBron James announced he would be returning to his old team, the Cleveland Cavaliers. Breaking the story was Sports Illustrated, which published a first-person essay by James "as told to" SI staffer Lee Jenkins. It was quite a get for SI; at the moment, the story's been tweeted over 140,000 times. But Richard Sandomir of The New York Times didn't care for it much, writing a story ominously headlined "Getting the Scoop, but Not Necessarily the Story; Role of Sports Illustrated in LeBron James's Announcement Raises Journalistic Questions."
…armed with the biggest news of the day, the magazine presented it as a 952-word statement on its website from the King, not a full-blown news story with context and breadth… News value aside, the approach cast Sports Illustrated more as a public-relations ally of James than as the strong journalistic standard-bearer it has been for decades. And while James’s words may have been all that the sports world wanted to hear, the magazine should have pressed for a story that carried more journalistic heft.

That seemed a bit much to me; I seem to remember the Times letting Angelina Jolie tell her story in first person when she had a big announcement to make. But baseball writer Craig Calcaterra pushed back better than I could:
This is crazy. It’s an instance where Sandomir and the Times — who I think are fantastic most of the time, by the way — are fetishizing the business of _Serious Journalism_ at the expense of understanding what sports fans actually care about, appreciating how informed sports fans already are and asserting that the reporter’s highest and best function is to get between fans and the news as opposed to delivering it to them. Question: what, apart from the name of the team LeBron James chose and his reason for choosing it, do people interested in this story either not know or actually care about? What sort of “journalistic heft” does Sandomir think should have been added to this to “serve the reader” better? Jenkins prefacing the actual news with "James, 29, from Akron, has played for Miami since the 2010-11 season," would not have added journalistic integrity here. It would have been byline-justifying filler. Everyone tuning in to this story knows what’s happening. Sports Illustrated and Jenkins provided them with the one thing they didn’t know: where James was going and why. If there is any concern about larger context here, it can and will be addressed by SI sidebars, bullet-pointed, fact-based graphics and, most importantly, an in-depth story from Jenkins about his conversations with James which provides deeper context. All of which, I assume, have either already been published or will soon be. [...] [James’ announcement is] a big event, sure, but at bottom it’s functionally equivalent to a team issuing a statement that it placed a player on the disabled list. That day’s starting lineup. A simple bit of data. A commodity. And just as sports teams and leagues are increasingly bypassing the press in order to release that sort of commodity news directly to fans via their Twitter feeds or in-house news operations, LeBron James could have very easily tweeted that he was heading back to Cleveland to his 13.6 million followers.
Or, like he did back in 2010, could’ve said it on some TV show _cum_ P.R. festival he created for himself. Indeed, it’s amazing to me that Sports Illustrated even got what it got here and they should be credited for getting that much. I didn’t need more than that yesterday. I’m more than happy — hell, very, very eager — to wait for Jenkins’ in-depth followup to all of this. I bet it’ll be incredible.
Photo of LeBron James in 2009 by Keith Allison used under a Creative Commons license.
It was well past midnight the morning of June 14 on the East Coast of the United States when Los Angeles Kings defenseman Alec Martinez slammed home the Stanley Cup-winning goal deep into the second overtime. In hockey-mad Finland, the deciding game was played in the dead of night, with the final goal coming after 7 a.m. local time. But the first game story many Finnish fans read that morning wasn't written in Los Angeles, or even Helsinki — but rather in Sydney, Australia, where it was already past 2 p.m. Sunday when the Kings' celebration began.
Earlier this year, the Finnish news agency STT-Lehtikuva shifted its overnight shifts from its Helsinki headquarters to Sydney. It's one of many news organizations around the world that now use time zones and a kind of geographic arbitrage to their advantage, covering breaking news and managing their websites during the overnight hours. With mobile devices booming — and with a quick scroll through headlines while still in bed increasingly becoming the norm — readers are demanding news content earlier and earlier, and that doesn't line up with how most newsroom schedules have traditionally been structured. As The New York Times' internal innovation report complained, "the vast majority of our content is still published late in the evening, but our digital traffic is busiest early in the morning." "A couple of years ago, it was enough to [have fresh news] at eight or nine," Mika Pettersson, STT's editor-in-chief and CEO, told me. "Once they were at work, they'd open their desktops at eight or nine. But now they open their mobiles when they get up. News agency clients demand us to be very alert in the morning, reporting what has happened during the night." Outlets with global reach, such as The Wall Street Journal and The Guardian, have staffers working around the clock in locales from New York to London to Hong Kong to Manila to Sydney. Other European organizations, like Germany's Bild, have bureaus in Los Angeles to man the overnight hours. And there are five European news organizations, including STT, that have their overnight staffs working in the headquarters of the Australian Associated Press in suburban Sydney. (The AAP's overnight staffers are based in London.) STT sent four Finnish journalists to Australia in May.
Along with the improved ability to cover overnight news, STT will save up to €60,000 ($81,800) annually in overtime expenses because it no longer has to pay Helsinki reporters to work overnight shifts; that's enough to hire an additional staffer, Pettersson said. And the Australia-based reporters can do virtually anything a reporter in Finland could do. They're covering not only sports, but things like crime and political news as well. The only practical limit, Pettersson said, is "if something really big happens in the capital, in the Helsinki area, you can't really go out from the office and cover news. But this happens so rarely — it happens once every two years or something."

A TRAINING OPPORTUNITY

At many news organizations, these international "overnight" jobs are plum positions — who wouldn't want to trade a Berlin winter for one in Los Angeles? — but the postings are often temporary, and staffers may be there for as little as a few months before heading back to their home base. Rotations at the German tabloid Bild's Los Angeles office typically last about five months, and a stint working the overnight shift in California is seen as a way of training journalists in digital reporting. Bild bills its Los Angeles coverage as Bild.live@Night, and every night a banner atop its homepage features a news ticker broadcasting the fact that the site is being updated from California. The L.A. team even has its own Twitter account. (It just retweeted this photo of Rihanna celebrating in Rio with Lukas Podolski and Bastian Schweinsteiger.) As a result, the five Bild staffers in L.A. need to be versed in all areas of online journalism — from social media and SEO to video and photography.
"It's not just about the nighttime traffic or saying we're in L.A., but it's also about a change management thing — to say, of course everybody knows what the future is like and that we need to change, and need to say that we're investing into digital," said Daniel Böcking, deputy editor-in-chief of BILD.de. "But the easiest way is to say, I'll just go to L.A. and do online for five months and once I get back, I am able to do everything, because we are such a small team [in Los Angeles], I need to learn everything."

A GLOBAL APPROACH

Though Bild is Germany's most-read newspaper, its audience is largely concentrated in Germany, so it's enough to have a small group of staffers in Los Angeles edit the website until Berlin wakes up. More complicated is an organization like The Guardian. With editions aimed at the U.K., U.S., and Australia, Guardian staffers on three continents are always managing their homepages. And though things can change based on breaking news, here's how the Guardian websites are managed over the course of a typical day:
UNITED STATES

* 7 a.m. EST to 7 p.m. EST: New York
* 7 p.m. EST to 9 p.m. EST: London (midnight BST to 2 a.m. BST)
* 9 p.m. EST to 2 a.m. EST: Sydney (11 a.m. AEST to 4 p.m. AEST)
* 2 a.m. EST to 7 a.m. EST: London (7 a.m. BST to noon BST)

UNITED KINGDOM

* 7 a.m. BST to 1 a.m. BST: London
* 1 a.m. BST to 7 a.m. BST: Sydney (10 a.m. AEST to 4 p.m. AEST)

AUSTRALIA

* 7 a.m. AEST to 7 p.m. AEST: Sydney
* 7 p.m. AEST to 7 a.m. AEST: London (10 a.m. BST to 10 p.m. BST)

The key to a system like this is ensuring a smooth handover between the different global desks — you don't want editors repeating tasks or making things difficult for one another, said Wolfgang Blau, The Guardian's director of digital strategy. One of the most difficult challenges editors face is deciding what is newsworthy in other countries. The Guardian, for instance, can send push notifications specifically to certain regions, and Blau said some of the toughest decisions come when editors must choose between global alerts, regional alerts, or no alerts at all. "It's not easy to have news judgment for a country you haven't been living in for a long time," he said. "You can read as many publications from that country and wire services and observe social media from that country, but it still requires a very skilled group of editors to do that well."

MOVING BEYOND THE BASICS

With print editions in Europe and Asia, The Wall Street Journal has had editing presences around the globe for decades, and with the advent of the digital age it adapted its London and Hong Kong headquarters to also be able to run its website.
But in the past year, the Journal has staffed up those bureaus to create global editing hubs in conjunction with its headquarters in New York. Now, the Journal is running video, graphics, and social media desks around the clock. In the past, more of the Journal's web operations were U.S.-oriented, so there would often be a queue to get content to editors in New York or delays in getting social or interactive components added to stories. Through these hubs, the Journal has also set up global reporting teams around certain coverage areas like mergers and acquisitions or technology. "Right now, as news breaks, it can immediately be edited and delivered to all appropriate platforms at the right time through these hubs," said Almar Latour, the Journal's executive editor. “We’re committed to growing our audience and one way in which we can grow our audience is by reaching international audiences, by reaching readers outside the U.S., and on top of that we are shifting to becoming a much more digital operation than we’ve ever been before," Latour said. "That requires us to be alive as a news organization and switched on as a news organization 24/7.”
Photo by leoplus used under a Creative Commons license.
Just like the Brazilian soccer team, @ReplayLastGoal is leaving the World Cup early.
Twitter suspended the account that automatically tweeted out a video and GIF of every World Cup goal, according to a tweet sent by Xavier Damman, who developed the Twitter bot.
.@ReplayLastGoal has just been suspended by @Twitter. That was fun while it lasted. Thank you all for your support and kind words. #fairuse -- Xavier Damman (@xdamman) July 11, 2014

In late June, Damman tweeted that he had received a takedown notice from Twitter, but the bot continued to send out tweets through the semifinal games earlier this week. FIFA, soccer's governing body, and the TV networks that own the rights to the games have been vigilant about removing unofficial GIFs, videos, and images of the World Cup games. At Recode, Peter Kafka, who first wrote about @ReplayLastGoal being removed, questioned how Twitter will handle instances like this in the future:
I do wonder how Twitter will approach this stuff for other big global sports events. Right now, the company’s approach is to leave anything and everything up until it gets DMCA takedown requests, more or less like YouTube. Unlike YouTube, however, Twitter doesn’t seem to have an expedited process available to let copyright holders pull stuff off the site. In ReplayLastGoal’s case, for instance, it seems to have taken Twitter 11 days to take the account offline. But Twitter is also the same company that’s basing much of its sales strategy around the idea that it’s working with TV programmers, not against them. One of its highest-profile ad products, for instance, lets programmers take sports highlight reels and turn them into ads minutes after they run on TV. That pitch may be harder to make if those highlights are already up on Twitter.
THIS WEEK'S ESSENTIAL READS: The key pieces this week are NYU's Jay Rosen (in an article and an interview) on Facebook's legitimacy and control, journalism professor Alberto Cairo on solving data journalism's current crisis, and Cornell's Tarleton Gillespie on algorithms, journalism, news judgment, and personalization.

FACEBOOK, ONLINE RESEARCH, AND CONTROL: A week after Facebook's experimental study on News Feed content and user emotions initially prompted an uproar, observers continued to talk about its implications for research, social media, and control. A privacy group filed a complaint with the U.S. Federal Trade Commission late last week, and the journal that ran the study published a formal "expression of concern." Others pushed back against the outrage at Facebook over the study: Microsoft researcher Duncan Watts argued that our emotions are being manipulated all the time by marketers and politicians and said we should insist that companies like Facebook "perform and publish research on the effects of the decisions they're already making on our behalf." Likewise, Forbes' Jeff Bercovici said Facebook's study was much more innocuous than it's being made out to be. Former Facebook data scientist Andrew Ledvina questioned why there was so much outrage about this particular study when Facebook conducts constant experiments on user behavior. The only change Ledvina saw emerging from this episode was not that Facebook would stop doing these types of experiments, but that it would stop making them public. Similarly, Microsoft's Andrés Monroy-Hernández said the incendiary tone of the discussion surrounding this study makes him more reluctant to share his own research publicly. Blogger and developer Dave Winer argued that this incident damages Facebook by destroying its users' illusion of an organically generated News Feed: "Facebook just broke the fourth wall. Of a sudden we see the man behind the curtain. 
And it's ugly and arrogant and condescending and all around not a nice feeling." On the other hand, Gigaom's Tom Krazit said most users already knew that Facebook manipulates their News Feed, but we don't know how they're doing it, so we can no longer consider it a useful source of news. NYU's Jay Rosen argued late last week that Facebook has a very "thin" form of legitimacy for its experiments because of its lack of attunement to ethical concerns, and in an interview this week, he warned that Facebook needs to be careful not to derive false confidence from its dominance of the social media market. "'Where else are they going to go?' is a long way from trust and loyalty. It is less a durable business model than a statement of power," he wrote. From a bigger-picture viewpoint, Stanford professor Michael Bernstein said researchers need to rethink how they apply the principle of informed consent to online environments, and Gigaom founder Om Malik looked at the responsibility that needs to come with the unprecedented level of data and automation that's involved in modern life.
IMPROVING DATA JOURNALISM WITH EDUCATION: A couple of conversations this week converged on two important issues facing the news industry: data journalism and journalism education. Miami journalism professor Alberto Cairo diagnosed the underwhelming output at some of the prominent new data-oriented journalism sites, concluding that "even if data journalism is by no means a new phenomenon, it has entered the mainstream quite recently, breezed over the peak of inflated expectations, and precipitously sank into a valley of gloom." Cairo made a set of prescriptions for data journalism, including devoting more time, resources, and careful critical thinking to it. At Gigaom, Derrick Harris added that data journalism could use more influence from data science, especially in finding and developing new datasets. And sociology professor Zeynep Tufekci used this week's World Cup shocker to look at flaws in statistical prediction models. In a weeklong series, PBS MediaShift took a deeper look at what journalism schools are doing to meet the growing demand for skilled, critically thinking data journalists. The series included an interview on the state of data journalism with Alex Howard of Columbia's Tow Center, tips from The Upshot's Derek Willis on "interviewing" data to understand it and find its stories, a look at how journalism schools are teaching data journalism, and practical suggestions of ways to incorporate data into journalism education. Elsewhere in journalism education, the American Journalism Review's Michael King examined enrollment declines at American journalism schools, noting the tricky question of whether these declines are a leading or lagging indicator — something primarily indicative, in other words, of journalism's last several years or next several years. 
And David Ryfe, director of the University of Iowa's journalism school, looked at the difficulty of fitting the dozens of skills desired by employers into a relatively small number of journalism courses.
GOOGLE RAISES CENSORSHIP CONCERNS: One story on Google and censorship came to a head late last week: In response to a May ruling by a European court, Google began removing pages from certain European search results based on requests citing irrelevant or false information. By last week, that began to include news articles, most notably a BBC column on Merrill Lynch, but also several others. In a pair of helpful posts, The Guardian's Charles Arthur explained the removals in general, and Marketing Land's Danny Sullivan explained the removals of news articles. Google reversed its decision to remove several links to Guardian stories as British news organizations criticized Google's implementation of the law. Gigaom's Mathew Ingram and The Next Web's Paul Sawers both posited that Google, which opposed the ruling, is enforcing it in a deliberately draconian fashion so as to create a controversy about the censorship it entails. But Danny Sullivan countered that it's highly unlikely Google is doing this to sabotage the ruling, since the links it's removing extend well beyond easily outraged media outlets.
READING ROUNDUP: A few other bits and pieces going on during this slow week:
— The Wall Street Journal celebrated its 125th anniversary this week with an archive looking at how they've covered big news events in the past as well as a special report. The Lab's Joseph Lichterman took a closer look at the Journal's anniversary offerings, and Capital New York talked to Journal managing editor Gerard Baker. The Atlantic's Adrienne LaFrance, meanwhile, had a fascinating look at the Journal's design through the years.
— The Pew Research Center released a study on the declining number of reporters covering U.S. state legislatures and what's being done to fill the gaps. The Lab's Joseph Lichterman wrote a good summary.

— This week's handiest piece: Sarah Marshall's summary of the tips Johanna Geary, head of news at Twitter UK, gave to British journalists about using Tweetdeck as a reporting tool.

— The U.S. National Security Agency documents leaked by Edward Snowden revealed another finding on surveillance this week: The Washington Post reported that the ordinary American Internet users whose communications were intercepted by the NSA far outnumber the legally targeted foreigners. The Intercept also reported on several prominent Muslim American academics and activists who were monitored by the NSA. The Intercept's Glenn Greenwald talked to Wired about why the story is important, while PandoDaily's Paul Carr questioned Greenwald's hesitation in publishing the story.

— Finally, the Lab's Caroline O'Donovan talked to Cornell's Tarleton Gillespie about a wide range of issues surrounding algorithms, including personalized news, news judgment, and clickbait.
In yesterday's paper, The Dallas Morning News announced it was ending its experiment with a "premium" site. We wrote about it back in October, when it launched. (The premium strategy replaced a more traditional paywall, albeit one that had hard categories of free vs. paid stories, not the metered approach most American dailies have taken. And to get my disclosure out of the way, I worked at the Morning News for eight years and root for it still.) The idea was that, rather than shut a lot of good content off from the free web, maybe you could increase digital revenue by creating a "premium" experience — a nicer look, getting rid of ads — and charging people 12 bucks a month to access it. As the DMN story notes, the premium experience was also "launched with promises of personalization and loyalty programs to come later," which never really materialized. I appreciate the Morning News' willingness to stray from the newspaper norm in seeking revenue. It was an early leader in wringing more revenue out of its most loyal print subscribers; it's tried out multiple approaches to a free targeted daily product; that paywall strategy went against the grain. But you could see this result coming a Texas mile away. The premium site was not some beautiful, immersive experience — it was aggressively ugly and a pain to navigate. I found it actively worse than the non-premium site, and far from a good enough offering to drive payment. From last fall:
All Dallas Morning News articles: free! All articles laid out onto rectangles with photo backgrounds: $143/year http://t.co/U5ZZWk3c4e -- Joshua Benton (@jbenton) September 30, 2013
@saila Just frustrating. It’s my old paper! I love it! And this seems dumb. Hope I’m wrong! -- Joshua Benton (@jbenton) September 30, 2013
My rough early estimate for how many people will pay for the DMN’s “premium” visual version of the same stories is 0 http://t.co/U5ZZWk3c4e -- Joshua Benton (@jbenton) September 30, 2013
@abeaujon Bottled water offers convenience and portability. Those are real things! This is rectangles. -- Joshua Benton (@jbenton) September 30, 2013
@abeaujon I am totally on board with the idea that dropping the paywall could be smart. Not on board with what the premium here offers. -- Joshua Benton (@jbenton) September 30, 2013

For some more background and details, see this blog post from D Magazine, which features a good insider-juice-laden comment from Dallas journalist Eric Celeste, which I'll just copy-and-paste here:
• It's hard to know what lessons were learned by The News because so much of what went wrong here was a result of disorganization instead of strategy. The central question the premium site tried to answer -- would people pay money for a better web experience (what they internally called a "velvet-rope experience") -- was never answered because that experience never materialized. This was partly due to the suicidal timeline the project employed (which caused all other digital projects current and future to be neglected) but also because some elements were never rolled out. The experiment was supposed to have three components (what Dyer would often call "three legs of the stool"): 1) a better looking site; 2) one with little-to-no ads; 3) one that offered significant subscriber perks. The third part -- which was Dyer's responsibility -- never really happened. _[I'd argue the first never happened either. Dyer here is chief marketing officer Jason Dyer. —Josh]_ They imagined offering Christmas card photos taken for you by Pulitzer-winning photogs, or game-watching parties with beat writers. They ended up offering T-shirts. That was part of the problem. The other: • The marketing/sales folks who were effing this cat never got newsroom buy-in. Top newsroom folks were against the premium site from Day 1. Once the premium site went live and started siphoning traffic (not much, but some) from the basic site, the newsroom freaked. Understandable, since you were diluting the newsroom's only real measure of success. And even if you think big gray corporate newsrooms need disruption, you're not going to convince them when your efforts fail spectacularly. The number of non-subscribers who actually came to the premium site, looked around, and said, "I'll pay for this" was "a fingers-and-toes" number, I was told today. • The News is not thinking right now about how to squeeze more money out of subscribers. 
It's just trying to find a way to reach a mobile audience so it can THEN figure out how to monetize it. The mobile efforts to which Dyer refers are just a mobile version of the premium site -- I know, I know, at least this time everyone will get it for free. But there is a comprehensive, integrated (advertising/newsroom/marketing/subscription) strategy being put in place for a mobile-first platform that should start rolling out this fall and continue for a few years. It's another valiant effort by the DMN to be nimble, to figure this new-media landscape out before it kills them. But first … • They have to do what Dyer wrongly says they've done: Take valuable lessons from their failures. The DMN learned NOTHING from this it didn't already know. The paper learned it with its paywall, and its tablet app, and when it tried to charge for high-school scores: People won't pay for content that is ubiquitous, and the newsroom will (perhaps rightly) sabotage any effort that doesn't get its reporters the biggest audience possible.
The Georgia state legislature passed 322 bills during its 2014 legislative session that ended in March. From a bill that eased restrictions on guns in churches, bars, and schools to one that would require some food-stamp recipients to take drug tests, much of the legislation that passed will directly impact the lives of Georgia's 10 million residents. Despite the importance of what transpired at the statehouse, there were only 17 full-time reporters covering the legislature, according to a study released this morning by the Pew Research Center examining changes among the statehouse press corps across the United States.
Though Georgia was one of the few states to actually see an increase in statehouse reporters in recent years, the total number of newspaper statehouse reporters fell 35 percent between 2003 and 2014, outpacing the 30 percent decline in overall newspaper newsroom staffing over that time, the study finds. And as statehouse coverage from traditional news outlets has receded, there's been an influx of for-profit and nonprofit digital startups, ideologically focused organizations, and pricey subscriber services trying to fill the void. There have been partnerships among legacy news organizations to cover state governments — and even efforts from legislatures themselves to distribute news and information. But these moves haven't been enough to fill the void, said Amy Mitchell, Pew's director of journalism research and one of the paper's authors. Despite the positive impact of having reporting reach more people through alternative methods, there still is a need for more individuals actually reporting the news, she said. "It also carries with it the potential downside of fewer perspectives — less really localized perspective — when it's coming to looking at what's happening and how this impacts my area of the state, and fewer bodies just watching what's happening and raising their hands to ask questions day-in-and-day-out of what the legislative leaders are doing at the state level," Mitchell said. The study identified 1,592 reporters covering the 50 U.S. statehouses. Of those, 47 percent cover the statehouse full-time. Even with newspapers' decline, 38 percent of all statehouse reporters still work for newspapers, the most of any type of media. Television reporters make up the next largest group at 17 percent, followed by what the report calls nontraditional outlets, with 16 percent. The Miami Herald and the Tampa Bay Times, longtime competitors for stories, merged their Tallahassee bureaus in 2008. 
The eight largest newspapers in Ohio also share stories on state government. “I think the absence of competition affects what our editors demand of us,” Miami Herald Tallahassee bureau chief Mary Ellen Klas told the researchers. “Stories that in the past they would want us to be all over, it’s a little bit harder for us to make that sell now.” Nontraditional organizations have the largest full-time reportorial presence in seven states — Connecticut, Michigan, New York, Ohio, Tennessee, Texas, and Vermont, the report said. That includes the nonprofit Texas Tribune, which has the largest capital presence of any outlet in the country with 15 reporters covering the statehouse full-time. In New York, Capital New York has the largest statehouse bureau, with five full-time staffers in Albany. The New York Times and Albany Times Union each have three statehouse reporters; the 24 other news organizations that cover New York state government each have two or fewer full-time reporters on the beat. "My fear is that as journalism recedes down [and] fewer people are there on a full-time basis, that's just going to set the stage for scandal if nobody's watching," Brian Howey, publisher of Howey Politics Indiana, told the researchers. "I think we're heading into a very dangerous territory in state government." The Pew study classifies nontraditional outlets into four different categories: nonprofits like The Texas Tribune; commercial digital natives like Capital New York; ideologically driven publications; and government insider outlets that charge steep subscription fees and are aimed at people who want deep, granular coverage of the legislatures. Also taking on a larger role covering state government: college students, either through internships with legacy news outlets, university or student publications, or even programs through their journalism schools. Fourteen percent of the statehouse press corps is made up of college students, the study found. 
This work can provide valuable training for aspiring journalists, and some journalism schools have recognized this by setting up programs to let student reporters get credit for reporting from the capitol. Print and broadcast students from the University of Montana, for example, receive a scholarship and credit to report from the Montana capitol, and their work is distributed by local outlets throughout the state that do not have their own state government reporters. “There’s a real impact there that students are having on the coverage," Mitchell said.
Photo of President Jimmy Carter addressing the Georgia state legislature from the National Archives and Records Administration.
No, Facebook and Google aren't practicing mind control on us, are they? That's just silly. Their business is the highly prosaic selling of advertising, less romantic than Mad Men, more lucrative than Midas. Mind control is just a side pursuit, one of those many auxiliary products in eternal beta, that might turn into something big. But mind control is on our minds. The two companies — which now control 49 percent of the $50 billion U.S. digital ad market and about 68 percent of the fastest-growing ad market, the $32 billion global mobile ad sector — both play mind games with their customers. How well do they do it? We don't yet know. In the biggest case, Facebook COO Sheryl Sandberg could offer only the lamest of apologies — "This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated. And for that communication we apologize. We never meant to upset you" — in trying to explain how Facebook had played mind games on 700,000 of its users in January 2012. The company had disproportionately displayed positive or negative statuses in those users' News Feeds for one week. Is this cluelessness, or just a symptom of unbridled, lean-back-and-play-with-them arrogance? Or both? (Excellent, thoughtful Jay Rosen parsing of Facebook's "having all the power" and its implications, in The Atlantic, here.) Yesterday, we learned, via Aarti Shahani of NPR, that Google's "social newsroom" rejected going negative in its social selection of content about Brazil's crushing World Cup loss to Germany Tuesday. "In Google Newsroom, Brazil Defeat Is Not A Headline," reads the NPR story. It goes on to explain that a collection of "data scientists, translators, cultural experts and copywriters turn search results on the World Cup into viral factoids." Well, not really the facts of the results, but a version of them meant to please readers and juice traffic. Factoid, rather than fact, perhaps. More in the pursuit of truthiness than truth. 
Let's be clear and fair: A Google news search does pull up "crushing," "stunning," "ruthless," and more in way of adjectives that describe the, uh, reality of the game. This Google unit, though, chose its own version of reality to present as fact(oid). Or didn't, as Shahani reports:
After the dramatic defeat by Germany, the team also makes a revealing choice to _not_ publish a single trend on Brazilian search terms. Copywriter Tessa Hewson says they're just too negative. "We might try and wait until we can do a slightly more upbeat trend." That puzzles me. Google has powerful data to see exactly what the audience wants, and produce news-on-demand. The entire world was searching for Neymar — Brazil's superstar player who sat out after fracturing a vertebra. Google could have looked for related search terms, and created content for people to grieve or laugh. I ask the team why they wouldn't use a negative headline. Many headlines are negative. "We're also quite keen not to rub salt into the wounds," producer Sam Clohesy says, "and a negative story about Brazil won't necessarily get a lot of traction in social."

Take a look at the picture of the four data scientists in the NPR story. Younger, earnest, undoubtedly brilliant. They are sitting in what Google calls a "newsroom." It's not the data scientists' fault. Some of my best friends are data scientists — but they don't pretend to be journalists. Google, and many of its fellow travelers, hold on to this pretense that they are "doing news," because they publish news, all or almost all of it written by practicing, standards-observing, non-mind-control-seeking journalists. Algorithms don't make a newsroom. They may make a news _product_ — some of which are highly useful to all of us — but they don't operate newsrooms. Journalists do that. The best ones do that without fear or favor, and they certainly don't do it with happy-face factory guidelines. Call it feeding the happy social network, if you'd like — which was Google's justification for eschewing truthful headlines about Brazil's humiliation — but don't call it news or a News Feed, and please don't say it came out of a "newsroom." Words have meaning. 
Google and Facebook provide many services to us that we now consider essential, almost irreplaceable. Yet they seem to have no boundaries. Business sector boundaries are a blur, as digital eats everything, but more troubling are their ethical boundaries. How can companies that seem to offer so much good — _for free_ — do bad things? Ironically, for companies so interested in knowing how their customers think moment by moment (so they can monetize that thinking), they are sometimes thoughtless about their own actions. One hundred years ago, the trust busters saw that too big is too bad for society and took steps to staunch runaway market dominance, steps that benefitted Americans for many decades. Today's bigness _seems_ so different than that of the early 1900s. Google, Facebook, and Comcast don't seem to be in the same league as Standard Oil, American Tobacco, and the Northern Securities Company. Despite all the hardware they own, they seem to traffic largely in pixels and invisible packets. Outmoded antitrust laws and the accompanying regulatory apparatus (the FTC, FCC, and DOJ in the U.S., several E.U. entities in Europe) can't keep pace, trying to apply old, sensible law to new sense-rattling innovation. Square peg, round hole; try it a thousand times, it won't work. It's not a fair contest; regulators are in a muddle (the Comcast/Time Warner Cable case is today's best example), and meanwhile unbridled market power multiplies. That market power is a big concern, but the two recent mind games that have surfaced (are there more?) raise greater questions. As odious as the NSA's spying on Americans (and everyone else) has been, the potential implications of mood control strategies could be far larger. Sensory manipulation is no longer sci-fi; Aldous Huxley's soma is going digital. What was the Facebook experiment on us about? Gauging the power of "emotional contagion through social networks." Imagine the uproar if Fox News or MSNBC had done that, or politicians. 
I can almost hear the Facebook and Google replies before the question is asked: Who asked you to skew your mass-reaching content to produce cheerfulness? The _people_, as expressed through the social hive, did, they'd say. Google and Facebook as servers — and pushers. If "Facebook intentionally made thousands upon thousands of people sad," as Slate's Katy Waldman succinctly put it in a smart column, some of us are just collateral damage. Our wondrous digital hive is alive and growing exponentially. That's largely a good thing, maximizing the reach of our too-small brains. It's not the hive that's at issue here. It's the big, monopolistic beekeepers who should give us pause. It's the ad business that should be fair game for Google and Facebook. It's enough of a challenge for everyone else, fair or not, if they just mind that.
Photo by Maigh used under a Creative Commons license.
That's according to a memo sent to staffers after a dustup involving a tweet from an NPR account. The memo, first reported yesterday by Romenesko:
“If you wouldn’t say it on the air, don’t say it on the Web.” That’s been the basic guidance for quite a few years. In reality, Twitter and other social media sites allow us to show more of our personalities than we might on the air or in a blog post. BUT, though the words may be on “personal” Twitter or Facebook accounts, what we say can reflect on NPR and raise questions about our ability to be objective. [...] Also, despite what many say, retweets should be viewed AS endorsements. Again, from the handbook: “Tweet and retweet as if what you’re saying or passing along is information that you would put on the air or in a ‘traditional’ NPR.org news story. If it needs context, attribution, clarification or ‘knocking down,’ provide it.”

At one level, this is nonsense. People use their personal accounts for a whole variety of things. The universe of news-related things is only a subset of that — for some, a small subset. A happy note about your kid's first steps would never be worth putting on air, but it's totally fine for Twitter; a job is only part of a life. To say that "we don’t behave any differently [on personal social media] than we would in any public setting or on an NPR broadcast" is just silliness. At another level, NPR has done pretty well operating on the assumption that its audience is intelligent enough that it doesn't need to be talked down to. The idea that retweeting a politician's comment is somehow an endorsement of that comment assumes that your followers are idiots. They're not. At still another level, is NPR aware that Andy Carvin was an employee there for some time?
Looks like they just backpedaled their long-held policy: NPR Argues RTs by Its Reporters Are Indeed Endorsements http://t.co/KIEUcZYYuc -- Andy Carvin (@acarvin) July 9, 2014
Did you know that wearing a helmet when riding a bike may be bad for you? Or that it's possible to infer the rise of kidnappings in Nigeria from news reports? Or that we can predict the year when a majority of Americans will reject the death penalty (hint: 2044)? Or that it's possible to see that healthcare prices in the U.S. are "insane" in 15 simple charts? Or that the 2015 El Niño event may increase the percentage of Americans who accept climate change as a reality? If you answered "yes" to at least 60 percent of the questions above, I'm 95 percent certain that you've been following the recent buzz around "data" and "explanatory" journalism. I'm talking about websites like Nate Silver's FiveThirtyEight and Ezra Klein's Vox.com. A few traditional news organizations have followed suit, either creating their own in-house operations (The New York Times' The Upshot) or strengthening existing efforts. There is a lot to praise in what all those ventures — and others that will appear in the future — are trying to achieve. Journalists are known for being allergic to math and to the scientific method; some even proudly boast about it. Many in our profession still stick to flawed practices, such as asking the same questions to two or more sources and then just reporting their answers, without weighing the evidence and then pointing out which opinion is better grounded. At first, the current popularity of the new wave of data journalism seemed to be a good antidote to the epidemic of hardball punditry and tomfriedmanism that has plagued the news for ages. When Silver published FiveThirtyEight's foundational manifesto, "What the Fox Knows," I applauded him with enthusiasm. After all, bad data is pervasive in traditional newsrooms. If you think I'm exaggerating, read this recent and infuriating Washington Post op-ed, which gets causality wrong, is oblivious of ecological fallacies, misinterprets sources, and ends with a coarse, insulting, and condescending line. 
Don't blame just the authors. Blame the editors at the Post, too. But I have to confess my disappointment with the new wave of data journalism — at least for now. All the questions in the first paragraph are malarkey. Those stories may not be representative of everything that FiveThirtyEight, Vox, or The Upshot are publishing — I haven't gathered a proper sample — but they suggest that, when you pay close attention to what they do, it's possible to notice worrying cracks that may undermine their own core principles. So what's wrong with the stories in the first paragraph? First, the piece on bike helmets is an example of cherry-picking and carelessly connecting studies to support an idea. It's possible to "prove" almost anything if you act like this. I can "prove" that vaccines cause autism — they don't — just by selecting certain papers, particularly those based on tiny samples or simple correlations, while ignoring the crushing majority that refutes my intuitions. Gladwellism — deriving grand theories from a handful of undersubstantiated studies — may be popular nowadays, but it's still dubious journalism. Second, using news reports on kidnappings in Nigeria as a proxy variable for actual kidnappings is risky. Proxy variables need to be handled with care. You cannot assert that there _are_ more kidnappings just because the media is running more stories about them. It might be that you're seeing more stories simply because news publications are increasingly interested in this beat. You won't know until you do some proper analysis. Third, long-term linear predictions of nonlinear phenomena are nearly always wrong. Writing that a majority of Americans will reject the death penalty by 2044 "if the trend continues" yields a catchy, SEO-friendly headline, but it means very little. 
The article says that there's no "reason to expect the trend to pick up speed anytime soon," but the same could have been said about support for gay rights a decade ago, and see where we are today. Unknown unknowns, confounding variables, and black swans can kill any simplistic prediction. As xkcd explained, you cannot predict that a woman will have four dozen husbands next month just because she was unmarried yesterday (zero husbands) and married today (one husband). That's not just bad math. It's a lack of common sense.

Fourth, will the 2015 El Niño event make more Americans accept that climate change is real? No idea. It could be. Or not. It's impossible to know, as that piece by The Upshot reads like wishful thinking. It does say that "belief in warming jumps when global temperatures hit record highs; it drops in cooler years," but the evidence to support that claim is not fully revealed, so we don't know if those "jumps" are relevant, significant, or just pure noise. Why should I trust the writer? I'm a journalist myself. I don't trust journalists.

Fifth, can we really assert that health care prices in the U.S. are "insane" based on 15 simple charts? I won't bore you this time, as there are a lot of fishy details in those graphics. If you're interested in the nitty-gritty, read this blog post. I bet you'll be as shocked as I was.
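The xkcd joke is easy to reproduce. Here is a minimal Python sketch — the two observations and the 30-day horizon are invented for illustration (xkcd extrapolates a little further out to reach four dozen) — showing why fitting a line to a trend and reading off a far-future date proves nothing:

```python
# Fit a straight line through two observations and extrapolate --
# the same move as "a majority will reject the death penalty by 2044."
def linear_extrapolate(x0, y0, x1, y1, x):
    slope = (y1 - y0) / (x1 - x0)
    return y0 + slope * (x - x0)

# Yesterday (day 0): zero husbands. Today (day 1): one husband.
husbands_in_30_days = linear_extrapolate(0, 0, 1, 1, 30)

# "If the trend continues," she'll have 30 husbands in a month.
# The arithmetic is flawless; the model of reality is absurd.
assert husbands_in_30_days == 30
```

The arithmetic never complains: a linear model happily extends any two points forever, which is exactly why it says nothing about nonlinear social phenomena.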
Is data journalism in crisis (already)? The main challenge that FiveThirtyEight and Vox (and, to a lesser extent, The Upshot) face is that they overpromised before they were launched but underdelivered after they went public. They promised journalism based on a rigorous pondering of facts and data, but they have offered some stories based on flimsy evidence — with a consequent pushback from readers. As I'm one of those readers, I will take the liberty of offering some suggestions:

1. DATA AND EXPLANATORY JOURNALISM CANNOT BE DONE ON THE CHEAP. It's hard to produce a constant stream of good data journalism on the cheap and with a small team. If history is any guide, we should remember that this is not how data journalism was done in the past. Journalism that takes advantage of quantitative methods and visualization is not new. As the Columbia Journalism Review wrote a while ago, "[Nate] Silver's work is arguably less revolution than evolution, one facet of a journalistic practice that has actually been around for decades."

True. In 1973, Philip Meyer, now a retired professor at the University of North Carolina, coined the term "precision journalism" in a book with that same title. Meyer is the most popular (but not the only) advocate for a methodical application of social science to the practice of journalism, and one of the founding parents of computer-assisted reporting. To give you an example of the power of this kind of journalism: In 1993, The Miami Herald won a Pulitzer for an investigation into why Hurricane Andrew caused such sweeping destruction in certain neighborhoods of Miami and Homestead while leaving others almost intact. (It was related to lax zoning inspection and building quality standards.) The investigation was based on databases and mapping, but also on careful on-the-ground reporting.

Data-savvy investigative reporters and visual designers at established news organizations haven't historically worked in complete isolation.
They can rely on relatively large infrastructures to provide funding, support, and legal aid when needed. It remains to be seen if a publication that conducts just this type of journalism can survive on its own. This leads to my next point.

2. DATA AND EXPLANATORY JOURNALISM CANNOT BE PRODUCED IN A RUSH. Both Vox and FiveThirtyEight are publishing new stories, blog posts, and explanatory pieces nonstop. Like any news organization nowadays, they live in a 24/7 world. They need to feed the goat, as their business models seem to be based on attracting adequately large audiences. There's a big risk in that. It's much easier to feed the goat with stories with titles like "The New York Times Editorial Shakeup As Explained By 'Game Of Thrones'" than with thorough analyses of population trends in Ukraine. (I'll admit I found both quite enticing, though.) It is tempting for a news startup to try to be both BuzzFeed and The Economist at the same time, no matter how chimerical that goal is. Lighthearted blahblah can be done quickly and nonchalantly. Proper analytical journalism can't. If you have a small organization, you may have to choose between producing a lot of bad stuff or publishing just a small number of excellent stories.

3. PART OF YOUR AUDIENCE KNOWS MORE THAN YOU DO. Here's the description of a perfect storm: Data journalism is often based on publicly available databases. Besides, current standards of journalistic transparency dictate that, after a journalist writes or visualizes stories based on data, she should disclose her sources and make her spreadsheets downloadable for anyone to check. Finally, due to the very nature of those stories, a good portion of the audience will likely be quite numerate. This helps explain some recent takedowns on social media. Plenty of readers out there know much more than we do about our own data, and nowadays they have the means to broadcast their outrage. They don't hesitate to do it.

4. DATA JOURNALISTS CANNOT SURVIVE IN A COCOON. Silver likes to quote Archilochus' phrase about foxes and hedgehogs: "The fox knows many things, but the hedgehog knows one big thing." Hedgehog intellectuals see the world through the lens of expertise in a single area, or of a single grand idea; fox thinkers borrow tools from many fields. Journalists tend to be foxes: We have a basic understanding of different disciplines, but we don't necessarily specialize in any of them. This is good on one hand, as it allows for richer and more lively reporting. But it also poses problems for data journalists, as you can't really extract meaning from data using only cookie-cutter templates.

No matter how great you are at analyzing stuff with the R statistics language, you'll be in trouble if you don't have a deep understanding of where the data came from, of how they were gathered, filtered, and processed, of their strengths and shortcomings. That's the reason why, in many universities, science departments teach their own statistics courses: It's not the same to use stats for sociological observations as for genetics, psychology, astronomy, or physics. The equations may be similar, but the outcomes of your analyses don't depend only on those equations. Context matters.

Foxes need to partner up with hedgehogs, journalists with specialists. And it's not just a matter of asking a couple of researchers some questions while you write a blog post — something that, by the way, neither Vox nor FiveThirtyEight does often enough (hence the lack of quoted experts in many of their stories). It's also a matter of doing your reporting in collaboration with those researchers, as they're the ones who know the data really well. This isn't a particularly groundbreaking notion: The publications that regularly conduct solid data and investigative journalism nowadays, like ProPublica, work this way on a regular basis. Their journalists are very conscious of the limits of their knowledge.
(Side note: It is doubtful that Nate Silver himself is a pure fox. His biggest successes when he was still at The New York Times depended on his familiarity with sports and election data. When it comes to those topics, he's a werehedgehog, so to speak.)
Most novel technological concepts or tools follow a Gartner hype cycle: First, the novelty is released and enjoys a peak of inflated expectations. Then it suffers through a valley of disillusionment. After that, when the hype vanishes, it reaches maturity and is widely adopted. Even if data journalism is by no means a new phenomenon, it has entered the mainstream quite recently, breezed over the peak of inflated expectations, and precipitously sunk into a valley of gloom. Hopefully, it'll soon enter the last phase, one of stability and productivity.

There's a need for journalism that is more rigorous and scientific. Data skills shouldn't be the turf of a small guild of savants — they should permeate journalism in general. Data and explanatory news organizations can help achieve those goals. Therefore, pundits and math-challenged journalists (like the folks at The New Republic, who seem to know as much about stats as I do about quantum mechanics, but feel entitled to write about it anyway) shouldn't feel triumphant after reading this column.

First, because Silver's manifesto is still a great read. The new data and explanatory journalism organizations just need to live up to their own standards to thrive — something that I honestly hope will happen soon. Second, because it's a good thing to be reminded of Fred Mosteller's famous assertion: "It is easy to lie with statistics, but easier to lie without them." Or Richard Feynman's: "The first principle is that you must not fool yourself, and you are the easiest person to fool." Indeed. Particularly when you don't know how to weigh qualitative and quantitative evidence.
Should Facebook be allowed to decide what information we do or don't see? Should Google be responsible for ensuring that its search results don't offend or incriminate? If we allow platforms to determine what content and information we encounter, are we defaulting on our civic responsibilities?

Lately, it seems questions like these — questions about the algorithms that govern and structure our information networks — are raised more and more frequently. Just last week, people were outraged when it was discovered that Facebook had tried to study the spread of emotion by altering what type of posts 600,000 users saw. But the reality is we know less and less about how news content makes its way to us — especially as control of those information flows becomes more solidified in the hands of technology companies with little incentive to explain their strategies around content.

Tarleton Gillespie has done a considerable amount of writing on what we know about these algorithms — and what we think we know about them. Gillespie is an associate professor of information science at Cornell, currently spending time at the Microsoft Research Center in Cambridge as a visiting researcher. We first met at the MIT Media Lab, where Gillespie gave a talk on "Algorithms, and the Production of Calculated Publics." His writing on the subject includes a paper titled "The Relevance of Algorithms" and an essay called "Can an algorithm be wrong?" More recently, he contributed to Culture Digitally's #digitalkeywords project with a piece on algorithms that explains, among other things, how it can be misleading to generalize the term.

We touched on the conflict between publishers and Facebook, Twitter trends, the personalization backlash, yanking the levers of Reddit's parameters, and how information ecosystems have always required informed decision-making, algorithms or no. Here's a lightly edited transcript of our conversation.
CAROLINE O'DONOVAN: What is the #digitalkeywords project, and why did you think "algorithm" was something that was important to define?
TARLETON GILLESPIE: The #digitalkeywords project is a project Ben Peters is organizing. The short of it is that it's inspired by Raymond Williams' "Keywords" collection from 1976. He wanted to get scholars to think about the terminology that matters now, and would be important both to scholarship around new media, and also to broader public audiences. For me, I've been finding myself thinking about the term algorithm quite a bit in the last couple years. In part, it came from a research project that is my main project, which is thinking about how social media platforms govern speech that they find problematic. That could be classic categories like sex and violence, it could be hate speech, it could be an array of things. But the question is, how are they finding themselves in charge of this problem of removing things that are unacceptable — finding themselves as cultural gatekeepers as publishers and broadcasters have had to be — and how do they go about doing it? And how do they justify those techniques?
As I was trying to figure that out, I was thinking about a whole array of things — like putting things behind age barriers, or safe search mechanisms, or blocking by country — and I was calling them algorithmic techniques. That made me think about updating a long literature about how technology matters, how you can design things so as to govern use — but how do you do that in an algorithmic sense, rather than, say, in a mechanical sense?
O'DONOVAN: You mention the way that social media platforms have, in a way that they maybe didn't even expect to, taken over a role that used to be strictly in the domain of publishers. One thing that stuck out to me in some of what you've written is the question: Is having to make arbitrary decisions that _seem_ objective an entirely new problem? You argue that it's not a new problem, it's just a new way of dealing with it — I think you call it a new information logic.
GILLESPIE: It's in some ways a very old problem. NBC has to decide what's acceptable at 8 p.m. And they do that within some guidance of what the FCC says, but mostly they're working within those barriers, and deciding what they think their audience will accept, what they think their moral compass is, what their advertisers will blanch at.
Now you've got Facebook and Apple and YouTube being in a similar situation, but I think the game is different. You can do classic stuff, like setting rules and deciding where the line is, but the other thing you can do now is design the platform so that you manage where those things go. It's not like there's no precedent for that either — you can build a video rental store and put all the porn in the back room, and have rules about who gets in there and who doesn't. That's using the material architecture and the rules to keep the right stuff in front of the right people and vice versa. But you can do that with such sophistication when you think about how to design a social media platform. You know much more about people, you know much more about what their preferences are, about where they're coming from, about the content. You can use the algorithm — the very same algorithm that's designed to deliver what you searched for — to also keep away what they think you don't want to see.
O'DONOVAN: That's an interesting point, too — what they "think" you don't want to see. You've written that the way Google knows if their search algorithm is working right is if the first five results have a lot of clickthrough. But when it comes to a more editorial judgment, how do you measure satisfaction? And is satisfaction even what we really want news consumers to experience?
GILLESPIE: That gets to this bigger question: What are algorithms a stand in for? We hear all the time about how powerful Google's algorithm is. It sounds very precise, it sounds very mathematical. But in reality, algorithms can be trained to look for a lot of things, and they are trained on a set of user practices, they're trained on a set of criteria that are decided by the platform operators, and what counts as relevant is just as vague as what counts as newsworthy. It's not a meaningless word, but it's a word that has a lot of room for interpretation, and a lot of room to have certain kinds of assumptions and categories built in but be relatively invisible.
O'DONOVAN: What do you feel like the future of personalized news content is? Did we maybe go a little too bold at first with our belief that that was possible, or that it would be satisfying, or that it was something that people or publishers really wanted?
GILLESPIE: I would love to think that the pendulum is going to swing back. A lot of this is driven by what a particular platform thinks it can do. There's a lot of push toward trying to predict what a user might want, well outside of journalism. Search results and advertising and news fell into that. Journalism is the place where it got pushed back the most, because we have a public interest in not just encountering what we expect to see, and that interest is much stronger than the same feeling about advertising, right? I don't like predicting the future, because I'm terrible at it, but the enthusiasm about personalization has shifted a bit. In some ways, I feel like what we're seeing now is platforms throwing every possible kind of slice at us. Any slice — here's what we think you want, here's what we think you asked for, here's what we think about what everyone else is doing, here's what your circle of friends is doing, here's what's timely, here's what's editorially interesting, here's what's connected to our advertising — so that in some ways we're given many slices through the available content, with personalization just being one or maybe a couple of those. I don't think it's going to go away. I think it remains one of the ways in which a platform can slice up what it has and try to offer it up. I do think the gloss has gone off it a little bit, and probably for a good reason.
O'DONOVAN: You write about the feeling of being satisfied by personalization. For me, occasionally a Pandora station I make will deliver what I didn't know I wanted. But there's a tension between that feeling of satisfaction when an algorithm performs the way you wanted it to, and this other feeling that we don't want to be quantified — we don't like the idea of this cold, objective robot making these decisions for us. How do you think about which of those two reactions wins out?
GILLESPIE: I think in some ways I worry that those two sides miss the real third way. We do have that experience where an algorithm seems to work really well. Satisfaction is one way to understand that feeling of accomplishment — it was exactly what I needed, it was quick, it was effective. And the opposite is what you were saying about the kind of coldness of it. I think what that hides is the way that, for a long time, we have navigated information in part through mechanisms that don't belong to us but try very hard to provide something for us. They weren't always calculational: Music reviewers are an intensely important way that we decide to encounter things, one that's both appealing to us and can work really well. When someone suggests something that we never would have heard, it's completely moving to us, and exciting. And we can be frustrated by it: These people are cultural gatekeepers. Are they attuned to what we really care about? Are they culturally biased? Elitist? We struggle with that. Similarly, when we deal with the quantified versions — the pop charts — is that an amazing glimpse of what people really are interested in? Or is it a weird artifact of what weird middle-ground material can make it above all the more interesting stuff?
We're always navigating information and culture by way of these mechanisms, and every mechanism has a built in notion of what it's trying to accomplish. That's the part we need to unpack. There's always going to be a tool that says if you're interested in THIS, listen to THIS. But the assumptions that tool makes about what it should look for, what it is we seek, and what's important about that form of culture — whether it's journalism or music or whatever — that's the part we have to unpack.
O'DONOVAN: Do you think that some of that confusion, or obfuscation, could be reduced by these companies labeling things more clearly? We think Trending is a specific thing, but maybe Twitter could use a different word that is more specific, or accurate. We think we know what a News Feed is, but it doesn't look or feel like we think the "news" should be. I'm not necessarily talking about more transparency in terms of what actually goes in to the rules — but if they did a better job at explaining what it is they're trying to do, besides just saying, We're serving the content that you like, or, We're serving the content that's popular right now.
GILLESPIE: I think that would be really terrific. There are obviously two obstacles for them in being much clearer than they are. One is, obviously, these aren't just services — they are commodities and advertisements for themselves, so they have to be catchy. The bigger one is not wanting to reveal too much of their secret sauce. They feel like these algorithms need to be guarded. They don't want them to be gamed by search engine optimization or trend optimization or whoever it would be. They don't want competitors to be able to just lift that and build their replacement. But it seems like there would be room for a kind of explanatory clarity that's not the same as giving away exactly how the algorithm works in specific terms, but that honors the fact that these different glimpses are different, and they differ in kind, not just sort of on the surface. Twitter has been relatively forthcoming about what Trends is, maybe as best as it could be. But I like a model like Reddit. When you go to the top of Reddit, there's about five choices of how to organize — there's trending, there's hot, there's controversial — and you can read about what those things are. It's not that that's the perfect way to do it, but at least the fact that there are different slices reminds you that these slices are potentially different. I think that they could have a little justification about how they thought about it. One of the things I found really interesting about Twitter Trends is that they'll weight tweets or hashtags that appear across different clusters of people that aren't connected to each other on Twitter higher than a lot of activity that happens in a densely connected cluster of people. You can imagine the opposite choice, where something that happens in a cluster of people really intensely, but isn't escaping — maybe that's exactly what should be revealed. 
Something like: You may not know anything about this, but somewhere, there's a lot of discussion about this, and you may want to know what that is. That fulfills a very different public or journalistic purpose. Yes, there are things that seem to be talked about on a wide basis, and we want to reflect those back, but we also want to say, over here in the world, in a place you don't have access to, there's something going on. Even if you just had those two things next to each other and talked a bit about how they're different, you'd offer users a way to think about their difference and make choices. And it would push the platform to think about why the difference might matter.
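The Reddit slices Gillespie mentions aren't entirely opaque: Reddit open-sourced its ranking functions years ago. Here is a rough Python paraphrase of two of them, based on that old open-source code (the production versions may well have diverged since):

```python
from math import log10

# Paraphrase of Reddit's once-open-sourced ranking functions;
# the live site may have changed since this code was published.

def hot(ups, downs, timestamp):
    """Net score on a log scale plus a steady recency bonus.
    `timestamp` is Unix seconds; 1134028003 is Reddit's reference epoch."""
    score = ups - downs
    order = log10(max(abs(score), 1))
    sign = 1 if score > 0 else (-1 if score < 0 else 0)
    return round(sign * order + (timestamp - 1134028003) / 45000, 7)

def controversial(ups, downs):
    """Scores highly only when a post draws many votes in BOTH directions."""
    if ups <= 0 or downs <= 0:
        return 0
    magnitude = ups + downs
    balance = downs / ups if ups > downs else ups / downs
    return magnitude ** balance

# Two posts with 1,000 total votes each: the evenly split one is the
# "controversial" one; near-unanimous approval barely registers.
assert controversial(600, 400) > controversial(990, 10)
```

The point for Gillespie's argument: each slice encodes a judgment — "hot" rewards recency, "controversial" rewards disagreement — and putting the formulas side by side is what makes those judgments legible.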
O'DONOVAN: It's interesting to think that a Facebook gets you to buy in to the idea of a social network — but now, is their main product that algorithm? And if it is, which of the platforms is going to be the first to offer, like you're saying, these different slices — a series of different algorithms? Instead of just offering one as a product, what they can boast is: We have fifteen different ways to search an unlimited corpus of data.
GILLESPIE: I would think that Twitter would be in a more likely position than Facebook, only because in their sense of themselves they seem to have an emphasis on the public role they play — the democratic role they play. It's not surprising that Reddit is a little farther ahead, because it's much more fascinated with the technical element of this, and it wears the technical element on its sleeve. Now, it's not clear that Trending and Controversial and Hot and New are the right four slices, or that they do the work I'm hoping they would do to reveal how these things are different. In some ways, the other way you do this — and it starts to sound like personalization, but I don't mean it this way — is to let the user play with the parameters of those differences. I don't mean, Boy, I hope I could set it so I can get all domestic and no international — that's the worst problem of personalization. But: Show me Hot and show me Trending and show me Controversial, and then let me pull the levers and change the parameters a little bit, and see what that does to the ranking. That recognition that even one algorithm's criteria shift based on the parameters — seeing that happen, not just knowing it intellectually — would be a pretty interesting glimpse into how much the choices are built into the apparatus.
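The "pull the levers" idea can be sketched in a few lines. Everything here is hypothetical — invented stories, invented metrics, invented weights — but it shows how a single scoring function with user-adjustable parameters reorders the very same content:

```python
# Hypothetical lever-driven ranking: the user, not the platform,
# chooses how much clicks, shares, and freshness count.
stories = [
    {"title": "Celebrity scandal", "clicks": 900, "shares": 50,  "age_hours": 1},
    {"title": "Budget analysis",   "clicks": 120, "shares": 300, "age_hours": 30},
]

def rank(stories, w_clicks=1.0, w_shares=1.0, w_recency=1.0):
    def score(s):
        return (w_clicks * s["clicks"]
                + w_shares * s["shares"]
                - w_recency * 10 * s["age_hours"])  # older stories decay
    return sorted(stories, key=score, reverse=True)

# Default levers favor the click magnet...
assert rank(stories)[0]["title"] == "Celebrity scandal"
# ...crank up the shares lever, and the analysis piece rises to the top.
assert rank(stories, w_shares=10.0)[0]["title"] == "Budget analysis"
```

Watching the order flip as one weight changes is the "seeing it happen, not just knowing it intellectually" that Gillespie describes: the choices built into the ranking stop being invisible.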
O'DONOVAN: Talk about data literacy! The idea of being able to one day pull all those levers yourself — that any person's understanding of how they get information involves tweaking all these things — that's an interesting view of the future.
GILLESPIE: Yeah, well, to me it's also a glimpse of the past. If we think about traditional media, understanding the levers that were at work there — the choices that people made to decide what was on the front page of the paper, or what was going to be broadcast this season — were those criteria being driven by certain kinds of sources? What were the assumptions being made there? Those things were very far from where a regular viewer or reader could even access them. And they were only glimpse-able in the moments either when there was a crisis of confidence — a newspaper blew it and had a big embarrassment and you could see the inner workings — or when you had a clever sociologist who could get in there and talk about how it works. Now, at least, the parameters could be more visible, if that were a feature someone chose to provide.
O'DONOVAN: I really liked that point — that in some ways an editorial meeting is more clouded for the typical reader than an algorithm is, because there's more chance for fallibility, so there's more incentive to make it hard for someone to understand how those decisions are made.
GILLESPIE: There's definitely a lot of obscurity to be thought about. I wouldn't want to paint it where the editorial decisions of a newspaper are totally obscured and the algorithm could be totally clear. I think the researchers that have written about news values and the calculations of editors that were about newsworthiness and timeliness and what an audience cares about — that was a kind of algorithm. There were certain things you would look at differently, and it would produce different outcomes. And you can only make that so visible, right?
O'DONOVAN: What's interesting to me is that The New York Times maintains its great reputation even though it's being beaten out for traffic — there was a part of the innovation report that said sites like The Huffington Post are just repackaging Times content and getting more traffic from it than the Times is. The Times gets to say, We do great journalism. Then, on the other side of the coin, you have a BuzzFeed, which is about as close as you can come to gaming a social algorithm, I think, but their reputation is bad in a lot of circles. Still, they're reaching so many more people that way. I don't know if there's a way to have both, but it seems like there should be a middle ground.
GILLESPIE: I don't know how you achieve that middle ground. I think we're in a place where it's awfully easy to recirculate stories that are produced. It reminds us what an incredible mechanism it was to say, We're going to be a newspaper that not only reports the story, writes the story, checks the story, produces the story, but then also manages turning it into paper, delivering it to street corners, having people sell it, managing subscriptions — that's an incredible apparatus. We got used to that as a 20th-century arrangement, whether it was in newspapers or film or in television. But now that whole second half of — "We will also manage the circulation of this content" — is fractured enough that it's just much harder to put the financial and emotional investment in the first half.
O'DONOVAN: I'm thinking of this point you make about how algorithms can change very rapidly — in fact, they're always changing, they're always learning, and they're never exactly the same thing to any two people, and they're never the same thing one day after the other. Then you have BuzzFeed, and what they're doing with their data input and analytics, which as I said is about as close to gaming a social algorithm as you can get. How is BuzzFeed doing that? And what happens when there's this mirrored back and forth? Let's say, for example, Facebook decides they want to downplay clickbait headlines. Theoretically, according to what BuzzFeed says about itself, they're going to notice that. They're going to notice that that trick is no longer working, and they're going to come up with a new trick, and then the algorithm would have to change in reaction to that. Is that a logical characterization of that feedback loop? And is there any way to change it?
GILLESPIE: I think it is, as best as I know. In some ways, BuzzFeed is a creature of this kind of algorithmic delivery of information. It's not so far from search engine optimization. It's put a lot of investment into watching the circulation of its stories, trying to figure out what gets circulated, and then tweaking it. If Facebook changes its algorithm, it hopes it will discern that and come up with a different theory. The other way to think of it is, they've got two forces to factor in. They've got to figure out Facebook's algorithm, but they've also got to figure out the audience. If their stuff drops off — let's say they see a lag in the previous month — is that because Facebook tweaked their algorithm? Because people were less interested? Is that because they didn't have as many interesting stories? Because no celebrities did anything embarrassing that month? It's very hard to discern this, and that's something cultural producers have had to do for a long time. Why didn't people come to this movie? Was it that it was terrible? Was it that word of mouth was bad? Was it a bad weekend? Is it the mechanism by which the movie gets to the people, or was it the content? There's a thin line between gaming the algorithm and trying to be appealing. The funny thing about clickbait as an idea is it's basically shorthand for: People really wanted to read this. Writing a really juicy headline to get people to read it, whether you got the substance or not, is not new to BuzzFeed and Upworthy. Is that gaming the algorithm? Think of the algorithm of the penny newspaper: You can see the front page on the shelf, and you can't see the content in it, so those words had better be big and gripping and delicious. Is that gaming the algorithm for how newspapers were sold? Or is that just trying to get people to read your paper?
The last part of this is, as BuzzFeed has shown, if you're beholden to the algorithm, what you do is not just sit there and try to guess the algorithm — you go and you meet with Facebook, and you strike a deal. That's the real story — who's going to get to strike the deal with providers, such that their stuff continues to stay on the network, or continues to be privileged.
O'DONOVAN: I don't know if BuzzFeed would admit that that's exactly what the nature of their deal with Facebook is.
GILLESPIE: Right, but they're not stupid. Looking to figure out how to stabilize that relationship is a lot smarter than trying to ride it, and hope that you understand the workings underneath.
O'DONOVAN: Publishers are frustrated because they feel like they're not reaching the number of people that they used to on a platform like Facebook. They say: The algorithm isn't meant to serve us, the algorithm is bad, the algorithm doesn't respect good journalism. But then a guy who works on Facebook's ad product had a blog post about the state of media saying all media is garbage these days and why don't we have good journalism anymore. Then there's Alexis Madrigal of The Atlantic, among many others, saying: What do you mean? We're doing the best we can here, but we can barely get any play on your platform as it is, and if we just did serious investigative journalism, no one would ever read it, and it's your fault. How did we end up in a situation like that, and what can we do about it?
GILLESPIE: I find myself regularly wanting to go back to: These are not new difficulties. We had this big shift in television news where for a long time, in the Cronkite and Murrow age, you couldn't be a television network without having a flagship news program that had gravitas and journalistic traditions and all that. We had what some people might point to as the Golden Age of Television News. And then at some point, a number of networks under various kinds of economic pressures said, We can't afford a loss leader anymore. That was one of those moments where the call on one side — We desperately need to have good journalism! — runs up against a distributor, a network in the traditional sense, that says, We have other competing priorities. Now, it plays on slightly different lines. It's not: We have to have a slate of programming that will draw a big audience. It's: We have a platform that calculates what people do and then responds to that. That means they can point to things differently. Instead of saying, we've got to make a buck at the end of the day — which of course they do — they can say, Look, it's a user-driven mega-community. Now, that's a misrepresentation, I think, of the decisions that go into what the algorithm displays in the first place. But it's not so different — the entity that helps deliver the news is not the same as the news, and their interests and their understanding of what they should be doing, and their commercial pressures, and how they came to do what they do, is not the same as having come into a kind of journalistic project from a journalistic standpoint. Now, a solution? That's harder. Do you call on these networks, on Facebook — and this is what Alexis Madrigal and Upworthy are doing — and say, You've got to look closer at the choices you're making about your algorithm, because you are in fact putting us at a deficit and you shouldn't? And when you say shouldn't, shouldn't according to what? According to some public obligation?
It's not clear that we expect Facebook to be a public service, even in the way that we expected NBC to be one.
O'DONOVAN: What if we had a news service where you can request a bespoke algorithm? You can tell it what you like to read, what you think you like to read, and it can watch your behavior, and you can tell it how you want to better yourself — you allow it to decide what the parameters of better or smarter are.
GILLESPIE: We've traditionally asked people what they want, and then sometimes given them what we think they should have, but we haven't really said, if you think you should have something, what is it and can we help give it to you. That's kind of cool, I like that.
O'DONOVAN: There's always a problem of — does length make something serious? Well, no, not really. Does a heavily titled byline make something smarter? Well, no. At the end of the day, we want to put our faith in someone else's judgment — and aren't we then just back at homepages?
GILLESPIE: I like the idea because it puts the algorithm in the service of both the public interest and the user, and tries to bring those interests onto the same page. It does bring up the issue that it's very hard to know something about content except what computers know well — things like length and source and date and keywords. But that last point is exactly right — we are in an information environment, and we always have been, where the best possibility of us being informed and thoughtful and ready to be participants in a democracy has always depended on other people. It's always depended on other people to be closest to the information we need, which makes them risky, because they're biased or subjective or emotional, but more importantly because they have to be participants in an institution that has to sustain itself, whether it's a newspaper organization or a TV network or a social media platform. That raises all sorts of problems too, about why they're really bringing the information to us that they are. I don't think we can really get away from that. So the only thing we can do is to continue to demand that these services provide what we think we need, the "we" being both individual and collective, and keep paying attention to the way they have structural problems in doing so. Whether that's algorithms or editorial acumen or yellow journalism, these are just the kinds of problems that emerge when institutions try to produce information on a public and commercial basis across a technical platform. We're just facing the newest version of that.
Image of an algorithm by Manu Escalante used under a Creative Commons license.
Seeking Alpha is a site for people to write up and share their investing ideas — "a platform for investment research, with broad coverage of stocks, asset classes, ETFs and investment strategy." Some of its contributors use pseudonyms, and earlier this year, someone using the _nom de investissement_ of "Pump Terminator" wrote a piece arguing that a company named NanoViricides was wildly overvalued and using sketchy business practices. NanoViricides went to court, demanding that Seeking Alpha turn over the real identity of Pump Terminator so that it could pursue a libel claim against him or her. Seeking Alpha fought it, and now, in what the site is calling a victory for free speech, the New York Supreme Court has denied NanoViricides' demand. (You can read the court's opinion here.) Of interest: The very nature of open crowdsourced platforms — the ruling lumps them together under the rubric of "message boards," though that seems imprecise in 2014 — makes it harder to pursue the sort of claim NanoViricides was trying to make. Quoting an earlier ruling (emphasis mine):
[i]n determining whether a plaintiff's complaint [or pre-action petition] includes a published 'false and defamatory statement concerning another,' commentators have argued that the defamatory import of the communication must be viewed in light of the fact that bulletin boards and chat rooms 'are often the repository of a wide range of casual, emotive, and imprecise speech,' and that the online _recipients of [offensive] statements do not necessarily attribute the same level of credence to the statements [that] they would accord to statements made in other contexts_.'

There's also a freedom-of-expression angle:
Finally, this court finds its holding in line with the First Department's urging in _Sandals_ that courts should protect against "the use of subpoenas by corporations and plaintiffs with business interests to enlist the help of ISPs via court orders to silence their online critics, which threatens to stifle the free exchange of ideas." _Sandals_, 86 A.D.3d at 45. Clearly the article herein at issue does not cast petitioner in a positive light and the court can sympathize with the filing of the instant petition. However, _it is paramount in an open and free society that we protect the anonymity of those whose "publication is prompted by the desire to question, challenge and criticize the practices of those in power without incurring adverse consequences."_

Section 230 of the Communications Decency Act (broadly speaking) protects people who run websites from being liable for the actions of their users/commenters. But this ruling goes beyond that, to merely revealing the identity of those users. It's hard to be sure about cause and effect, but NanoViricides' stock spent early February trading between $4.42 and $4.65 a share. On February 11, the date the Seeking Alpha article was published, it dropped to $3.36, a 23 percent drop from the day before, on trading volume 22 times higher than the day before.
You can hear gunshots in the background of this shaky amateur YouTube video. There's black smoke rising out of what looks to be a mosque; the narrator speaks in Arabic; the video's description says the video was shot in the suburbs of Damascus, Syria. The video is compelling, but is it real? If it's real, does it show a new event? And how can a journalist go about verifying its authenticity? These are questions journalists face every day as the amount of user-generated content, especially video, continues to proliferate. To help address these issues, Amnesty International today launched a new website, the Citizen Evidence Lab, to provide journalists and human-rights advocates with tools and lessons to help them authenticate user-generated video. Amnesty's efforts here are focused primarily on YouTube videos. There's a step-by-step guide that allows users to follow a detailed checklist to try to verify things like the history of whoever uploaded the video, the time it was uploaded, and where the video was shot. Tutorials and exercises cover basic skills like extracting audio from a YouTube video or downloading a YouTube video. Amnesty has also created a test where users can practice their verification skills using the Citizen Evidence Lab checklist.
Verifying user-generated content can be a massive undertaking for news organizations. Storyful, which was bought last year by News Corp, has created a whole business out of verifying social content. In April, Storyful announced a partnership with Facebook to run a page, FB Newswire, that shares content that was posted to Facebook. That Damascus mosque video above was used as part of the assessment the Citizen Evidence Lab posted on its site. The step-by-step guide walks users through the steps used to confirm its authenticity. In this case, the video was posted by a media collective that seems to produce videos from that particular area. The collective has posted more than 600 other videos, has its own website, and is active on other social media platforms. Additionally, the weather in the video matches weather reports from the day it was uploaded. And a satellite image of the mosque matches the one that appears in the video. The preponderance of the evidence would seem to support the idea that the video is authentic.
The Wall Street Journal marked its 75th anniversary on July 8, 1964, with a front-page story examining the paper's history and what it stood for. Though initially conceived solely as a business newspaper, the Journal wrote at the time that, in its view, business news covers "everything that somehow relates to making a living." "Journal coverage of such news is shaped by a belief that to understand or explain business, it's necessary to look at what people are doing, thinking and feeling about matters that are seemingly remote from 'business,' but that influence it indirectly," the Journal wrote. That expanded editorial approach means that today, as the Journal marks its 125th anniversary with a special print section and a series of online interactives — including a timeline with more than 300 archival clips — the paper is able to take advantage of a broad and deep archive to mark its quasquicentennial, and to keep using that archive, and what it has developed for the anniversary, beyond the current celebration. The anniversary package includes a lot of material — Taylor Swift's contribution seems to be getting the early attention — and that includes an array of interactives. The timeline allows users to search by era or theme to examine Journal content from the past 125 years. There are videos embedded in the interactive, and original Journal content is presented on individual cards that expand to show full stories and are also shareable. A game of sorts, scheduled to be published later this week, asks readers to place individuals who influenced the world of business — from Steve Jobs to Charles Ponzi — on a grid measuring their influence and innovation. Users can leave comments on why they placed a person where they did, and the matrix will also show the averages of where all users put an individual.
The paper has also been running a blog all year long, Today in WSJ History, that has been updated with archival content corresponding with dates in history and anniversaries, like the 50th anniversary of the Beatles arriving in America, and also with modern news events. For instance, when the director Harold Ramis died earlier this year, the blog posted two stories on him from the archives. The blog will be updated through the end of the year, but the Journal plans to ensure that the interactive presentation of its archival material stays active past the anniversary date, said Matt Murray, the Journal's deputy editor-in-chief. "This is something that will live well beyond the anniversary," Murray said. "We can use it in museums — we can keep adding to it. It's just a much more engaging, fun way of getting into our archives than going through microfilm and going page after page."
Surfacing archival content is a natural priority for any news organization with a history spanning centuries. The New York Times' leaked innovation report highlighted improved use of archives as a potential strategic advantage for the Times, and the same idea applies to most newspapers. The Journal is still working out the details on how exactly it will build on the interactives after the anniversary. Multimedia editor Madeline Farbman said that, for example, the matrix game (which she helped create) could be reused whenever one of the figures on it is back in the news. "The people who are on it, they remain in the news, so when they're in the news, you resurface this and you give it another chance to be found with whatever the latest story is," she said. On the print side, the Journal is wrapping today's newspaper with a facsimile copy of its first-ever edition, annotated with additional details and modern information on what the Journal covered on July 8, 1889. (However, if you want to pick up a copy, it'll cost you $2, not two cents like it did in 1889.) They've also produced a mockup of a 2014 Journal front page as if it were covering the news of 1889. (Today in Personal Journal: Goddesses in Bodices: The Latest in Taffeta.) But the heart of the print section, which is also featured in the European and Asian editions of the paper, is a series of essays, called The Future of Everything, examining the future of various industries, written by leaders in those fields. Along with Swift, Facebook's Mark Zuckerberg is in there, writing on connectivity. There's also a special opinion section. All of the print content will, obviously, be on the Journal's website as well. While the interactives will be outside the Journal's paywall, only a few of the essays will be free to read for non-subscribers.
Still, the interactives have been designed for sharing: each individual story or video on the timeline is shareable, and the Journal has been working with the contributors to encourage them to share what they've written as well. Taylor Swift, she of the 41 million Twitter followers, retweeted a @WSJ tweet of her piece, which helped drive it to nearly 7,000 retweets and favorites as of this writing.
Taylor Swift on the future of music for our #WSJ125 anniversary special: http://t.co/XV5M05xby3 pic.twitter.com/nMNJ0g0kL4 -- Wall Street Journal (@WSJ) July 7, 2014

(It also drove the kind of Twitter replies one imagines @WSJ rarely gets.) The Journal has used social media in the run-up to the anniversary, asking readers to participate in the celebration by sharing photos of old copies of the Journal that they've saved. And readers can also expect a concerted effort from the Journal to promote its anniversary package.
Which news events have meant the most to you? Send us a photo of the copy of the @WSJ you've saved: http://t.co/eohfPLKC4F #WSJ125 -- Wall Street Journal (@WSJ) July 2, 2014

"We're using SEO and [social] and all kinds of other methods to bring in new readers who don't already know who we are," said social media editor Allison Lichter. "This is another extension of that."
THIS WEEK'S ESSENTIAL READS: The key pieces from the past couple of weeks are Sebastian Deterding on the ethics of Facebook's experiment, the Columbia Journalism Review's Michael Meyer on Jeff Bezos' plan for The Washington Post, and Nick Davies' sweeping review of News Corp.'s phone hacking scandal and British tabloid journalism culture.

_The review has been off the last two weeks, so this week's review covers the past couple of weeks._

FACEBOOK'S ETHICALLY DUBIOUS EXPERIMENT: Facebook was under fire again this week for collecting data from its users without their knowledge, this time in conjunction with Cornell University professors for an experiment on the influence of Facebook's News Feed on its users' emotions. The study, which was published in May, involved skewing what nearly 700,000 users saw for a week in their News Feeds with more positive or negative words and then measuring the positivity and negativity in their own posts. The Atlantic's Robinson Meyer has a good explanation of the procedural and ethical details behind the study: Cornell's institutional review board, which reviews all research the university does involving human subjects, wasn't involved until after the experiment was finished. And as Forbes' Kashmir Hill reported, the statement in Facebook's terms of service that it can use its users' data for research wasn't added until after the study was conducted. It's not clear what review the study did get — in another Hill article, Facebook said it conducted an "internal review" of the study. The Atlantic's Adrienne LaFrance also reported on the misgivings of the study's editor as well as her reasons for approving it. Gigaom's Mathew Ingram put together a good summary of the criticism and defenses of the study's ethics from people within and outside Facebook.
British regulators said they're investigating Facebook over the study, and Facebook executive Sheryl Sandberg apologized on the company's behalf — not for the study itself, but for communicating it poorly. One of the study's authors, Facebook data scientist Adam Kramer, defended the study's design while apologizing for "any anxiety it caused" and noting that Facebook's internal review processes have improved since the study was conducted. Numerous writers condemned Facebook's callousness in running the study, including Mike Masnick of Techdirt, James Poniewozik of Time, Jordan Ellenberg of Slate, and Alex Wilhelm of TechCrunch. Wired's Katie Collins argued that the study reminds us that "Facebook as a company trades in information, not people," and both Charles Arthur of The Guardian and David Holmes of PandoDaily warned that the study indicates Facebook's immense power and its willingness to use that power for ignoble ends. Several researchers published defenses of Facebook: The University of Texas' Tal Yarkoni argued that concerns about Facebook manipulating its users' experience are overblown because the News Feed is an entirely artificial environment, the site of constant manipulation. Northeastern's Brian Keegan argued that "every A/B test is a psych experiment." And in a more measured post, Microsoft researcher danah boyd said that too much of the criticism has narrowly focused on Facebook because it provided a concrete point on which to focus their anxiety about big data. Sociologist Zeynep Tufekci pushed back against those defenses, arguing that the concern about manipulation is a legitimate one: "It is clear that the powerful have increasingly more ways to engineer the public," she wrote. "That, to me, is a scarier and more important question than whether or not such research gets published."
Design researcher Sebastian Deterding had the most thorough ethical breakdown of the study, explaining the clash of opinions as a collision between understandings of the study as academic research and as social media A/B testing. At The Atlantic, Sara Watson said the controversy centers on the question of whether data science can consider itself a science. Sociologist Janet Vertesi said this study points up the larger issue of increasing corporate funding of academic research. Microsoft researcher Kate Crawford called for future experimental studies to be made opt-in, and at Wired, Evan Selinger and Woodrow Hartzog urged the development of a "People's Terms of Service Agreement."
A BIG COURT WIN FOR BROADCAST: Aereo, a startup that allowed users to pay to stream over-the-air television by renting tiny antennas, lost its case in the U.S. Supreme Court last week in a big victory for broadcasters. In its majority decision, the court stated that Aereo was not so much an equipment provider (as the company claimed) as a cable system that transmitted copyrighted content. Cable carriers have to pay retransmission fees for the over-the-air networks they broadcast, which Aereo was trying to avoid. Aereo suspended its service in the wake of the decision while it determines if it can find a way to continue, while its streaming-TV competitors began to move in on its spot in the market. Techdirt's Mike Masnick, Public Knowledge attorney John Bergmayer, and Notre Dame law professor Mark McKenna all critiqued the legal foundations of the decision, concluding that its vague definition of why Aereo was substantially like a cable system provides little guidance for future cases and leaves the door open for a raft of legal challenges and differing conclusions. At The Guardian, Julian Sanchez argued that if future courts don't care much about technological differences between Aereo and cable systems, the ruling's precedent could endanger a whole range of cloud-based services, and Vox's Timothy B. Lee made a similar point about the perilous future of cloud storage. Gigaom's Derrick Harris said the impact on cloud services won't be as severe as feared, but DVR could be challenged. Fox has already used the Aereo decision to support its case against a streaming-TV service by Dish, and Variety's Ted Johnson looked more closely at several possible outcomes from the ruling: rising TV bills and retransmission fees, more timidity among startups, and a broader legal definition of what constitutes a "public performance." Forbes' Sarah Jeong said we'll never know the innovative startups we've lost as a result of this ruling. 
Recode's Peter Kafka said that while the decision helps the TV industry in the short run, it could hamper its development in the long run, since a legal Aereo would have pushed it to innovate more aggressively in light of its inevitable disruption. Instead, he said, "they’ll be sticking with lucrative business as usual for now. Pretty sure we’ve seen this show before." Michael Learmonth of the International Business Times made a similar point. At the Columbia Journalism Review, Sarah Laskow said local TV news may have avoided catastrophe with the ruling, since a decision in Aereo's favor may have eventually meant reduced retransmission fee revenue or even a move by the networks to pay TV.
RESOLUTION AND CONTINUED QUESTIONS IN HACKING CASE: After at least three years at the center of the British media spotlight, News Corp's phone hacking scandal reached something resembling a denouement last week, when the trial of two of its principal figures concluded with the acquittal of Rebekah Brooks and the conviction of her deputy, Andy Coulson, on a conspiracy charge. Brooks, the former head of Rupert Murdoch's British newspaper holdings, proclaimed herself "vindicated" by her acquittal amid speculation News Corp might deploy her to Australia. Coulson, the former editor of the now-defunct News Corp tabloid News of the World, was hired as Prime Minister David Cameron's spokesman in 2010, a move for which Cameron apologized last week. News Corp's trouble is certainly not over, though. Scotland Yard informed Murdoch it wants to interview him in their investigation into the phone-hacking case, and in the U.S., the FBI is still investigating whether anyone from the company may have broken American law. The Daily Beast's Peter Jukes reported that the FBI has 80,000 emails from News Corp's New York servers, and the Columbia Journalism Review's Ryan Chittum said that while it will take quite a bit of firepower to go after Murdoch, his potential influence is being substantially diminished. At USA Today, Michael Wolff noted how Murdoch was distanced from Brooks' and Coulson's trial, and The New Yorker's Ken Auletta wondered whether the British tabloid press will be chastened by the embarrassment that the trial was for their industry. At The Guardian, Suzanne Moore said the scandal exposed the coziness between British journalists and politicians, and The Economist said it diminished the political importance of the British press. 
The Telegraph argued that the trial was an underwhelming spectacle that ultimately showed there isn't a conspiracy among the press against the public, but in a scathing review of the scandal and the trial, The Guardian's Nick Davies said that despite the not-guilty verdicts, the News Corp newspaper empire's corruption and coarsening of British public culture was on full display. The Independent's Cahal Milmo and James Musick also reviewed News of the World's behavior in the scandal, emphasizing its willingness to cut ethical corners in order to land scoops. The Guardian also expressed its hope that the era in which the British tabloid insisted that there was no right to privacy had ended. "In its place should come respect for the universal right to privacy, honoured by all those who wield power – a mighty news company no less than the state itself," the editorial stated. The Guardian's media columnist, Roy Greenslade, criticized the British press for its shoddy coverage of the case.
EGYPT JAILS THREE JOURNALISTS: Three Al Jazeera English journalists who had been arrested in Egypt in December were sentenced last week to seven to 10 years in prison on dubious terrorism-related charges after a surreal and chaotic trial. The Guardian had a vivid account of the verdict, while The New York Times focused on the response by the U.S. government. Journalists around the world rallied to the jailed trio's cause, including protests by hundreds of journalists in London. The Committee to Protect Journalists condemned the verdict as a politicized result with no connection to the law, asserting that "Egypt cannot be allowed to normalize its international relationships so long as it continues to jail journalists." Despite the pressure from numerous Western governments, Egyptian president Abdel Fattah el-Sisi said he wouldn't interfere with the court's decision.
READING ROUNDUP: A few of the other stories and discussions that have merited some attention over the past couple of weeks:

— Poynter's Andrew Beaujon reported that The New York Times will close more than half of its blogs, including its aggregative news blog The Lede, as part of a long move away from blogs at the paper. Gigaom's Mathew Ingram expressed concern that The Times will lose some of the innovative drive that came with the blogs, though Times public editor Margaret Sullivan said moving away from blogs could be a good thing for The Times, encouraging continual experimentation, as long as its journalists can integrate what they've learned from them into the rest of their work. Blogging pioneer Dave Winer said The Times' blogs were never truly blogs because they were edited and impersonal, while PandoDaily's David Holmes countered that we shouldn't worry about what's blogging and what's not.

— SCOTUSblog, one of the top sources of U.S. Supreme Court news and analysis, had its appeal for a congressional press pass from the Senate Daily Press Gallery denied last week based on concerns about its independence from the law practice of its publisher, Tom Goldstein. Goldstein wrote a defense of his site's credentialing case, one echoed by Talking Points Memo's Josh Marshall and Techdirt's Mike Masnick. The Columbia Journalism Review reviewed the history of SCOTUSblog's application to the Senate press gallery to critique the gallery's decision. SCOTUSblog also got support from the Newspaper Guild-CWA.

— Upworthy released the source code for its preferred metric, attention minutes, which focuses on time spent on a site rather than number of visits or shares. BuzzFeed explained what's in it for Upworthy, and Digiday's Ricardo Bilton, the Columbia Journalism Review's Fiona Lowenstein, and Gigaom's Mathew Ingram all looked at what other publishers think of using attention as a primary metric.
— Finally, the Columbia Journalism Review went deep into Jeff Bezos' efforts to restore The Washington Post's global ambition. It's a lengthy, well-reported look at some important changes underway there.
Photo of Facebook dislike by Owen W Brown used under a Creative Commons license. Photo of Al-Jazeera English producer Baher Mohamed, acting Cairo bureau chief Mohammed Fahmy, and correspondent Peter Greste by AP/Heba Elkholy.
It's tough to find a place with more news change than Portland, Oregon. Last fall, The Oregonian became the latest Advance newspaper to adopt the company's digital-first restructuring ("The newsonomics of Advance's advancing strategy and its Achilles heel"). That's meant cutting back home delivery to three days a week, publishing a skinny newsstand edition the rest of the week, letting go of dozens of veteran editorial staff, and moving to a compact print format (video here). That's a whirlwind of change. Now into that rearranged topography steps Mark Katches, named Monday as editor of The Oregonian. It's a surprising choice to many in the trade. Katches is a highly respected investigative editor, with two Pulitzers on his resume and a further five finalist nominations for projects he's been involved in. He's been out of the daily trade for five years — having served three years at the Milwaukee Journal Sentinel and 10 years before that at the Orange County Register. Since then, he's been at the Center for Investigative Reporting, based in the Bay Area, which has been innovating strongly in real multimedia public service journalism. He's broken lots of stale boundaries in the old business, developing new, modern ways of reporting the news ("The newsonomics of a single investigative story"), as CIR has used everything from deep databases to coloring books to do and distribute its work. Katches leaves an organization that's been at the forefront of high-value journalism, partnering with the likes of Frontline, The Washington Post, NPR, and Marketplace. The CIR (merged with California Watch) model: collaborative, partnered, data-driven, addressing-public-ills-with-solutions, accessible multimedia journalism. What does he plan to bring from his experience? "We're going to be aggressive about breaking news and watchdog reporting," he told me Wednesday, flying back to the Bay Area after being introduced to the newsroom Wednesday afternoon. "And we're going to figure out more cool, creative ways to engage with readers on topics they care most about." His task starts on July 21.
He comes to a paper depleted, transforming, and fighting to maintain its longtime No. 1 presence in the new Oregon news landscape. The Big O has won six Pulitzers since 1999, but only one since 2007, last year's for editorial writing, and it's been plainly wounded by its fast-twitch flipping of the print switch. Its staff has been radically changed over the last year. As many as four dozen journalists left in the cuts accompanying the print cutbacks. Some number — skewing younger, including interns — have been hired back into the cut slots. Among recent hires, encouragingly, are two well-regarded journalists: environmental investigative reporter (and Voice of San Diego alum) Rob Davis and state government reporter Jeff Manning, who returned to the paper. Former editor Peter Bhatia, who resigned in March and is teaching at Arizona State University, focusing on News21, had made a point of maintaining a core reporting corps of 90. It's unclear if that number holds. I recently spent a couple of days in Portland, talking to major players there. It's remarkable how much the news field there is morphing — in part because of the perceived vacuum offered up by The Oregonian's cuts. * OPB, OREGON PUBLIC BROADCASTING, is a hard-charger among mid-sized public radio/public media stations, itself doing the kind of multi-platform work in which CIR has excelled. The radio/digital news operation, an outgrowth of its joint ownership of both public radio and public TV, is now a player in local and statewide news — previously an area The Oregonian, long a statewide paper, had to itself. Its news staff totals about 40, with 10 devoted to digital. In certain areas, like its EarthFix Northwest regional environmental coverage, its staff of seven full-timers easily outnumbers the competition. In fact, it is in regional collaboration that the operation serves as a national model, one recently the subject of a Corporation for Public Broadcasting board meeting in Portland, at which I spoke.
OPB leads both the radio-oriented Northwest News Network and the digital-oriented Northwest News Partnership. Portland, a metro area of 2.3 million, has no full-time AM all-news radio station, offering up another advantage to OPB.

* One for-profit local online news operation ("The newsonomics of the for-profit move in local online news") just hired Rick Daniels, former president of The Boston Globe and chief operating officer of GateHouse, as COO, signaling an intent to grow more media partnerships and build out the business generally.

* THE PORTLAND TRIBUNE, whose owner Pamplin Media Group now runs 25 local titles overall, recently doubled its frequency, adding a Tuesday edition to its Thursday one.

* WILLAMETTE WEEK, the Pulitzer-winning alt-weekly, is expanding its mobile presence and seizing on print and digital city guide opportunities. It has increased its free weekly print run and ramped up content offerings, especially with blogs.

Against that emboldened competition, The Oregonian is regrouping — and Katches' appointment is both a real and symbolic point in that strategy. How is the regrouping going? It's hard to say, given the private company's very selective release of favorable data. Its digital reading numbers are up impressively, as we suspected they might be, given the cutbacks in print. Advance Local president Randy Siegel recently touted The Oregonian's OregonLive as a leader of the Advance pack, at 40 percent growth in pageviews. We don't know its digital ad revenue growth, but can peg it at somewhere between a third and a half of the audience growth. Almost alone among U.S. chains, Advance's newspapers are still free online, pumping pageviews but collecting no all-access or digital-only reader revenue. It is, of course, in print that the loss shows. On Sunday, the paper is now under 200,000 paid copies. It hits a high of 170,000 on full-day weekday editions. For a historical view, consider what those numbers were in 1997: daily circulation of 360,000, and 450,000 on Sunday.
In early April, it began printing its new compact edition, following up on the home delivery/printing cuts that took effect Oct. 1. On full days (Wednesday, Friday, Sunday), the new stapled paper makes a substantial impression. On Mondays, Tuesdays, and Thursdays, it can be a remarkably skimpy 12 to 16 pages in total, almost an afterthought of throwaway print. Saturday is a "bonus" delivery day, heavy on sports and ads. We can figure that profit is probably up, but not by a lot. Print volume loss is greater than projected, eating into circulation revenue. The company has also retooled its local ad sales operation ("The newsonomics of selling Main Street"), and that is beginning to pay dividends in digital ad income. In the newsroom, there's necessarily a big change in digital-first thinking, and it's very much a change in progress. The paper has been accused of setting page-spinning daily output/blog quotas for its staff. No doubt that's been a double-edged sword. There is a greater news intensity, and The Oregonian is on top of areas it wasn't on top of before. Meanwhile, readers say that in other areas, coverage is noticeably decreased, and sometimes just missing. Again, it's not a one-for-one replacement of news reporting and analysis. Less can be less. What can't be easily measured: goodwill, community clout, and agenda-setting ability. By all those measures, The Oregonian is diminished. For decades, it was the statewide presence, the straw that stirred the drink in a state of two million and more (now almost four million). Yes, the rest of the state's newspapers have grown smaller, and less impactful, over the last decade as well. Yet it is The Oregonian's seeming shrinking from the public square that stands out the most. It is also what makes it seem more vulnerable to the increasing Portland competition. That competition is real and indicative of a future we seem to be moving into in many U.S.
cities: the singular power of a daily newspaper decreases, while a number of smaller, individually less powerful publications grow. It's not a one-for-one replacement, and the new ecosystem's ability to hold powerful interests accountable is uncertain. Of course, there's one big journalistic question: What now is The Oregonian's commitment to longer-form enterprise reporting? Katches' appointment is meant, in part, to address that, and to rev up the slimmed-down Oregonian. The new editor should hit the ground running. "The skill he really nurtured here was thinking across all platforms. We didn't have a newspaper or a high-traffic website. We had to do new things, so we had to experiment with storytelling. Mark was in the middle of all that," says Robert Rosenthal, CIR's executive director. Rosenthal credits Katches' leadership on two projects in particular, Charity Checker (done in partnership with the Tampa Bay Times) and Rehab Racket (done with CNN). (Stepping quickly into Katches' job is Robert Salladay, who has served as CIR's managing editor. He led the impressive "On Shaky Ground" and "Broken Shield" Pulitzer finalists.) Coincidentally, CIR just completed a worthy project — "Averting its eyes, Alabama lets prisons sink into despair" — working as a consultant and collaborator with Advance's Alabama Media Group. "Journalism _with_ people, not just _to_ people," is how K.A. Turner, director of opinion and commentary for Alabama Media Group, describes the work. Look for that kind of partnering between CIR and The Oregonian going forward. The news operation needs all the smart partnering it can muster. Yet, for Katches, the newsroom is what it is. As a newcomer, he can see it for what it is today, and build however possible. That's the thing about new people coming into old organizations: Jeff Bezos didn't look at his Washington Post purchase as buying a newspaper that had lost half its circulation.
He saw a top regional brand with ongoing relationships with several hundred thousand readers and advertisers. For Katches, one key question will be his ability to hire. One secret of CIR's success has been Rosenthal's and Katches' ability to pluck the best free agents out of its thick stack of resumes, mixing and matching skills and personalities. In fact, Katches' network has been responsible for more than half of CIR's hires, says Rosenthal. Will The Oregonian, which has recently done its own hiring of younger journalists and interns, in part to replace higher-paid journalists bought out or laid off, give him enough leeway to make an impact with new blood? That's one of the interesting things about the new Advance model. It is religiously digital — its orthodoxy, I still believe, overzealous. Yet it does provide some room, within Advance management's often-prescriptive mandates, for real positive change. With Oregonian publisher Chris Anderson (a molder of the old Orange County Register's growth, as both editor and publisher) and Mark Katches teaming up, this will now become a new, very watchable experiment in multimedia journalism.
Photo of Oregonian offices by Josh Bancroft used under a Creative Commons license.
Pew Research Center's Internet & American Life Project is out with a report today based on a survey of leaders in information technology. Earlier, Pew asked experts what digital life would look like in 2025; today's update focuses on potential threats to our information network. The concerns are broken down into four categories:
1) Actions by nation-states to maintain security and political control will lead to more blocking, filtering, segmentation, and balkanization of the Internet.
2) Trust will evaporate in the wake of revelations about government and corporate surveillance and likely greater surveillance in the future.
3) Commercial pressures affecting everything from Internet architecture to the flow of information will endanger the open structure of online life.
4) Efforts to fix the TMI (too much information) problem might over-compensate and actually thwart content sharing.

The first two categories have broad implications for journalists — for example, their safety, and how they practice their craft. Journalists will simultaneously have to work to push back against government and commercial control creep while learning how to resist surveillance, as will everyone else. The second two categories, however, speak more directly and immediately to the world of digital publishing. For example, as those commercial pressures mount, it will become increasingly important to make sure that news and information providers understand and have access to the heavy-duty tools of the Internet, so that rapidly consolidating power and money do not overwhelm them:
Glenn Edens, director of research in networking, security, and distributed systems at PARC, said, “Network operators’ desire to monetize their assets to the detriment of progress represents the biggest potential problem. Enabling content creators to easily and directly reach audiences, better search tools, better promotion mechanisms and curation tools — continuing to dismantle the ‘middle men’ is key.”

The fluidity of content was also a major concern for some, who look forward to considering access a right, and believe that sharing is the antidote to a fractured Internet:
Clark Sept, co-founder and principal of Business Place Strategies Inc., wrote, “Online content access and sharing will be even better and easier by way of personal digital rights access. Sharing freely will be recognized as having greater long-term economic value than strictly limited controls over ‘intellectual property.’”

Or, more briefly:
Jim Harper, director of information policy studies at the Cato Institute, responded, “People are going to get what they want, and they want to share content.”

If the risk of an information ecosystem in which content is produced and controlled by a few economically powerful players isn't clear, Doc Searls explains:
“What the carriers actually want — badly — is to move television to the Net, and to define the Net in TV terms: as a place you go to buy content, as you do today with cable. For this they’ll run two-sided markets: on the supply side doing deals with ‘content providers’ such as Hollywood and big publishers, and on the demand side by intermediating the sale of that content. This by far is the most serious threat to sharing information on the Net, because it undermines and sidelines the Net’s heterogeneous and distributed system for supporting everybody and everything, and biases the whole thing to favor a few vertically integrated ‘content’ industries.”

But there's a flip side to all that sharing and fluid content, says Mike Roberts, former ICANN leader:
“God knows what will happen to the poor authors. John Perry Barlow says ‘information wants to be free,’ which, pursued to the ultimate, pauperizes the authors and diminishes society thereby. There has been recent active discussion of this question on the ICANN former director list.”

Despite these looming systemic threats, many respondents were concerned about a challenge we already face every day — how to efficiently locate the information that we want, and how to guarantee that it will continue to be served to us. Writes Michael Starks, an information science professional:
“The challenge will be in separating the wheat from the chaff. Will people who can create, edit, judge, find, and curate content for others become valued for those skills? If so — and if that value is reflected in the salaries those people receive — then highly networked populations will have greater access to better content as well as more content.”

According to the report's authors, complaints about the sorting process of the future included but were not limited to: "algorithms often categorize people the wrong way and do not suit their needs; they do not change as people change; search algorithms are being written mostly by corporations with financial interests that could sway the ways in which they are being written; search algorithms can be gamed by certain outside interests to sway searches to their advantage." Susan Etlinger of the Altimeter Group raised additional concerns:
“With regard to content, the biggest technical challenge will continue to be filter failure; algorithms today just cannot keep up with the number and type of signals that provisionally predict what a person will want at a certain point in time. There are so many barriers: multiple devices, offline attribution, and of course simple human changeability. We will continue to see a push and pull with regard to privacy. People will continue to adapt, but their expectations for control and relevance will also increase.”

A few survey respondents went so far as to imagine the kinds of systems they believe might, or at least should, come into place to help us deal with the deluge of data. Marc Rotenberg, president of the Electronic Privacy Information Center, expressed concerns over consolidated control of the tools we use to find information:
“Currently, approximately 70% of Internet users in the U.S. and 90% in Europe obtain information by going through the search services of one company. This needs to change. There should be many information sources, more distributed, and with less concentration of control. So, I am hoping for positive change. We need many more small and mid-size firms that are stable and enduring. The current model is to find an innovation with monetizing potential, incorporate, demonstrate proof of concept, sell to an Internet giant, and then walk away. This will not end well.”

What might one type of small firm that helps redistribute control of search and distribution look like?
Jonathan Grudin, principal researcher at Microsoft Research, predicted, “To help people realize their fullest potential, an industry of ‘personal information trainers’ — by analogy to personal trainers for fitness — will form to help people find and access information that is interesting and useful for them. Reference librarians played this role when we went to the information repository called a library. As the volume of information continues to grow exponentially, personal information trainers will help us with the much more daunting task of creating a virtual dashboard to access the information of value to us, much of which we did not know was out there.”

Ultimately, writes Robert Cannon, a U.S. Internet law expert, we are past "the initial utopian introduction that greets the technology with claims of world peace," and past the era of competition. "In the information era," he writes, "we have moved into the era of consolidation." But there may yet be hope:
“Unlike other cycles where the era of consolidation also raised barriers to entry, in the modern information era, the barriers to entry still remain low. But this can change as conduit becomes entangled with content or service. [...] This is the core concern of the Net neutrality debate: Will the Internet of the future look like the radio market or the telegraph market after consolidation, with few players controlling content — or will it continue to look like the never-ending marketplace of ideas?”