- Guardian unifies its newsgathering under one domain
- We need to protect the act of journalism, no matter who’s doing it
- This Week in Review: Yahoo goes after Tumblr’s cool factor, and investigative reporting under fire
- More on the tech side of the Reuters.com redesign
- Pew’s new data blog fills in the contextual gaps between information and stories
- The newsonomics of value exchange and Google Surveys
- Who’s reusing the news?
- How the BBC handles responsive images
- Objectivity and the decades-long shift from “just the facts” to “what does it mean?”
- Amazon finds a way to monetize fan fiction
- Jaron Lanier wants to build a new middle class on micropayments
- Internet Archive plans to grow its TV news catalog
- Web video doesn’t always have to be short
- Twitter builds internal guidelines for handling legal battles over patents
- Tuesday Q&A: CEO Baba Shetty talks Newsweek’s relaunch, user-first design, magazineness, and the business model
- At The Miami Herald, tweeting’s about breaking news in the a.m. and conversation in the p.m.
- Isolating the elements of compelling graphic design
- They put the U in UGC: BuzzFeed builds a Community vertical as a talent incubator
- The “death” of “tech blogging”?
- How is algorithmic objectivity related to journalistic objectivity?
- Using the Raspberry Pi to get around newsroom IT
- The value of starting your own news brand — and sponsored content done right
- This Week in Review, Spy vs. Spy edition: Backlash against snooping by DOJ and Bloomberg
- Google’s design evolution
- The New Yorker launches Strongbox. What are the experts saying?
- You have to admire the ambition
- Meet the new class of Nieman Fellows
- Can Newsweek ‘snowfall’ on a weekly basis?
- Rafat Ali’s Skift plans expansion
- Ebook sales are giving book publishers a boost
The Guardian is putting its digital house in order, announcing today a new domain: theguardian.com. The new URL will be a universal address, bringing together the US site (guardiannews.com), the UK version (guardian.co.uk), the mobile site (m.guardian.co.uk), and the forthcoming Australian edition. In a conversation earlier this week, Janine Gibson, editor of the US edition of the Guardian, said the new URL is a way to tie together all of the Guardian's operations. "Having one URL, one domain, which is the global home of The Guardian, is the most sensible way to grow and expand and make it clear that's what we are," said Gibson. It's a sign, she said, that the Guardian is no longer just a printed paper in the UK, but a company with a digital emphasis and global reach. She pointed to the US as a good example: the American edition had more than 13 million unique visitors in April, a 38 percent increase over 2012.
In response to some recent events, the Digital Media Law Project took a look today at some landmark legal cases in the history of media protection. It argues that rather than laws protecting journalists, a status that can be hard to pin down for any individual, we need laws that protect anyone engaging in the act of journalism.
[P]rofessional journalists now share the information ecology with a much wider array of members of the public who care about particular communities and issues. These individuals can often speak from deep personal knowledge and identify important information that others might miss. And from the Rodney King incident forward, there has been recognition that sometimes informing the public is not about education and professional commitment, but about being in the right place at the right time. Institutional media organizations still play an important role in conveying information gathered by individuals to the public at large, but the Internet provides many other paths to an audience. The citizens involved in bringing this information to the public don't need to be called "journalists" for the information they possess to have value (although these people are entitled to respect and are free to argue their right to that title). Regardless of names, the manner in which this information of public importance is gathered and conveyed is entitled to no less protection than traditional newsgathering.
YAHOO SNAPS UP TUMBLR: Yahoo approved a $1.1 billion deal to buy Tumblr over the weekend, giving the company, as The New York Times' Jenna Wortham noted, a big stake in the current wave of simpler, more personal and expressive social media platforms. Yahoo pledged "not to screw it up," and its sources told All Things D's Kara Swisher they'd take a hands-off approach to Tumblr. But it will make changes — Yahoo CEO Marissa Mayer said it would introduce more ads to Tumblr's dashboards. A lot of people were immediately skeptical of Yahoo's ability (or willingness) to keep Tumblr un-screwed-up, including GigaOM's Mathew Ingram, who noted, as many others did, that Yahoo was unable to do that with its prior purchases Geocities and Flickr. PandoDaily's Sarah Lacy pointed out that a lot has changed at Yahoo since then (particularly Mayer's presence). Still, blogging pioneer Dave Winer warned that when you sell a company, your buyers can do what they want with it; promises don't matter. YouTube's Hunter Walk offered Yahoo some helpful tips from Google's management of his own site. Several other writers along with Lacy saw this move as a good one for Yahoo. Forbes' Alex Konrad saw it as a success as long as Yahoo maintains a light touch, and Ingram said it made some sense as a desperation move. Reuters' Felix Salmon said the deal looks good from both sides — it allows Yahoo better user data and advertising opportunities, and it lets Tumblr pawn off its profitability problems. Tumblr co-founder Marco Arment said the Yahoo deal will allow Tumblr's founder and head, David Karp, to focus on design and user experience while offloading concerns about maintenance and money to Yahoo. Business Insider's Jay Yarow and Nicholas Carlson cited similar reasons in arguing that this deal works for Tumblr, and reasoned that it also helps Yahoo solve its mobile problem. 
Tech entrepreneur John Battelle said the key is Yahoo's shift from traditional to native advertising, and Fast Company's Sarah Kessler advised Yahoo to adopt Tumblr's creativity in developing native ads. Fortune's John Saroff listed some reasons to doubt Tumblr's effectiveness for advertising, however. BuzzFeed's John Herrman argued that Yahoo is buying Tumblr for access to its young user base, which sees Yahoo as representing adults, and with them cluelessness and boredom. Ingrid Lunden of TechCrunch asserted that Tumblr's users will jump ship rather than go with it to Yahoo, and WordPress saw a jump in posts at the announcement, potentially (though not necessarily likely) suggesting a migration. Business Insider's Jay Yarow saw a troubling dip in traffic at Tumblr over the past few months, while Timothy B. Lee of The Washington Post saw Yahoo's fixation on young, "cool" users as a sign that it's thinking too much like a media company, rather than a tech company. Paul Smalera of Reuters, meanwhile, argued that Yahoo and Tumblr's fundamental clashes in philosophy could be greater than their shared strengths. The Guardian's Michael Wolff was similarly skeptical, writing that the Tumblr purchase is fueled more by desperate investors than anything else: "Nothing in Yahoo's muddled experience and questionable competence suggests it knows anything about the social business or has any sensibility that has anything to do with cool." Finally, tech entrepreneur Anil Dash offered some thoughts about how Tumblr upended the conventional wisdom on blogging and furthered the form as a result, and Referly's Danielle Morrill wrote that Tumblr could have had an even bigger windfall if it had figured out how to better monetize its user base.

IS INVESTIGATIVE REPORTING BEING CRIMINALIZED?: After last week's revelation that the U.S. Department of Justice seized a broad swath of the Associated Press' phone records in a leak investigation, we were hit with another case of the U.S.
government snooping into journalists' work in an effort to hunt down leak sources, as The Washington Post reported that the DOJ had named Fox News reporter James Rosen a criminal "co-conspirator" in the leak of classified information about North Korea as a means of getting search-warrant access to his personal emails. Poynter's Andrew Beaujon has a good overview of the developments in both cases; we'll start with the AP then move to Rosen. The AP continued its public outrage at the DOJ's actions, with its CEO calling the move unconstitutional. The Los Angeles Times talked to a media law scholar who backed up the AP's concern, as well as a prosecutor who said the seizure was pretty routine. The New York Times lined up six perspectives on the subject as well, with a couple of them defending the Obama administration. The Huffington Post's Michael Calderone explored the complex relationship between the government and the press that the seizure illuminates, especially when it comes to national security. The president of the Society of Professional Journalists urged Congress to pass the media shield bill that President Obama advocated last week, though journalism professor Chris Daly argued against it, calling for journalists to instead insist on the rights they already have under the First Amendment. Meanwhile, USA Today's Devin Karambelas and David Schick noted that college newspapers often face similar heavy-handed (and often illegal) pressures from administrators. Regarding the Rosen case, NBC reported that Attorney General Eric Holder signed off on the search warrant himself, and Fox's Shepard Smith claimed that the government's intrusion went further — that it "went into" Fox News' servers without notifying the organization. The DOJ maintained, meanwhile, that naming Rosen as a co-conspirator in a subpoena doesn't mean it ever wanted to charge him with a crime, and Obama said they don't think journalists should be prosecuted for soliciting government information. 
Still, just as they were last week, many observers were appalled at the DOJ's treatment of Rosen. Fred Kaplan of Slate called it "a quantum leap" in the Obama administration's crusade against leakers, The Guardian's Glenn Greenwald called it indefensible (and noted that the Obama administration has used this rationale before with WikiLeaks, though fewer journalists came to its defense), and The Washington Post's Dana Milbank said it's "as flagrant an assault on civil liberties as anything done by George W. Bush’s administration." Mike Masnick of Techdirt said it's clear this is more about waging war on investigative reporting than protecting national security. The common argument was simple: This action criminalizes basic reporting in a way that seriously jeopardizes journalists' First Amendment rights. The Washington Post's Leonard Downie emphasized the chilling effect that this has on investigative journalism, and the Freedom of the Press Foundation's Trevor Timm urged journalists to stand up for their rights. On the other hand, Reuters' Jack Shafer drew attention to Rosen's faulty practices of keeping his sources secret (as well as the negligible value of the scoop). Talking Points Memo's Josh Marshall said he was more shocked at Rosen's ineptitude than the government's actions. A group of former attorneys general (or deputies) defended the administration's fight against leaks in The New York Times. Forbes' Daniel Fisher explained a bit of the conflicted attitude toward leaks from the government's perspective. Others pushed back against the anti-leak attitude: The Washington Post's Chris Cillizza argued that leaks are essential for government to be kept accountable. Conor Friedersdorf of The Atlantic made the point that leaks will always continue as long as public servants believe information should be public, but if the government goes hard after professional journalists, those leaks will go through less established channels, like WikiLeaks and Anonymous.
READING ROUNDUP: A few other bits and pieces this week amid the big stories:

— Scrollkit co-founder Cody Brown created a replica of The New York Times' famous multimedia feature Snow Fall as a demo of his web design product, but The Times demanded he take it down, then demanded again that he remove any reference to their paper. Techdirt's Mike Masnick defended Brown, while The Times' spokeswoman defended the paper to Poynter's Andrew Beaujon. The episode sparked a lengthy Twitter fight documented by the Columbia Journalism Review's Sara Morrison, and prompted The Awl's Choire Sicha to report that journalists secretly resent Snow Fall for being hyped as the "future of journalism" that everyone should be expected to pull off.

— ESPN laid off an estimated 300 to 400 employees this week. The company has been rolling in cash from cable and satellite subscriber fees, which might leave you scratching your head as to why these layoffs had to happen. Business Insider's Tony Manfred and Forbes' Chris Smith pinpointed ESPN's skyrocketing rights fees to broadcast various live sports as the primary cause.

— Finally, a couple of interesting pieces at the Lab this week — journalism professor Nikki Usher on the use of Twitter at the Miami Herald, and Jonathan Stray on some fascinating research on the evolution of objectivity toward contextual journalism.

_Photo of Marissa Mayer and David Karp by AP/Frank Franklin II._
Building on our piece on the Reuters.com rethink, Source went back to get the nerdier details from Paul Smalera. Of note: it's all built on an API called Media Connect that generates the content feeds for all of Reuters' new platforms and products:
The other thing this setup lets us do is show off the depth of Reuters content. We produce, including videos, text, pictures, and other content types, something like 20,000 unique items per day. But our current website really didn’t let us show off the depth of our reporting. So one of the main functions of the CMS is really set up to allow editors to create and curate collections — we call them streams — of stories. This lets us get to the endless-scroll type behavior that Facebook, Tumblr, Twitter, and the rest have made popular as the new default behavior of the web.
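The stream-and-scroll pattern Smalera describes is easy to sketch as a cursor-paginated feed that a client drains one page at a time. Everything below is hypothetical: the function names, the page shape, and the `next_cursor` field are invented for illustration; Reuters has not published Media Connect's actual interface.

```python
def paginate_stream(fetch_page, page_size=20):
    """Walk a cursor-paginated stream until it is exhausted, yielding
    one story at a time -- the "endless scroll" consumption pattern."""
    cursor = None
    while True:
        page = fetch_page(cursor, page_size)
        for item in page["items"]:
            yield item
        cursor = page.get("next_cursor")
        if cursor is None:
            break

def make_fake_stream(n_items):
    """In-memory stand-in for a feed API: an editor-curated 'stream'."""
    stories = [{"id": i, "headline": f"Story {i}"} for i in range(n_items)]
    def fetch_page(cursor, limit):
        start = int(cursor) if cursor else 0
        chunk = stories[start:start + limit]
        nxt = str(start + limit) if start + limit < len(stories) else None
        return {"items": chunk, "next_cursor": nxt}
    return fetch_page

# Three pages (20 + 20 + 5) come back as one seamless scroll of 45 stories.
fetched = list(paginate_stream(make_fake_stream(45)))
print(len(fetched))  # 45
```

The point of the design is that the scrolling behavior lives entirely in the client loop; the API only hands back curated pages, so the same feed can power a website, apps, and products alike.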
The Pew Research Center launched a new blog earlier this week that's meant to provide Pew-quality data and information at a real-time pace. It's called Fact Tank, and it will be a home for what Pew calls its "unique brand of data journalism." Since Tuesday, they've written up data snapshots on topics like Secretary of State John Kerry's approval rating, American support for drone usage, and media coverage of the Oklahoma tornado. Alan Murray left The Wall Street Journal to head the Pew Research Center in November.
What happens when a reader hits the paywall? Only a small percentage slap their foreheads, say "Why didn't I subscribe earlier?" and pay up. Most go away; some will come back next month when the meter resets. A few will then subscribe; others just go elsewhere. So what if there were a way to capture some value from those non-subscribing paywall hitters — people who plainly have some affinity for a certain news site but aren't willing to pay? Welcome to the emerging world of value exchange. It's not a new idea; value exchange has been used in the gaming world for a long time. As the Zyngas have figured out, only a small percentage of people will pay to play games. So they've long used interactive ads, quizzes, surveys, and more as ways to wring some revenue out of those non-payers. It's a variation on an old saw that says much of life boils down to two things: money and time. It also brings to mind the classic Jack Benny radio routine, "Your Money or Your Life." If people won't pay for media with currency, many are willing to trade their time. Now the idea is arriving at publishers' doorsteps. It is being tested mainly, but not exclusively, as a paywall alternative. Yet, as we'll see, there may be many other innovative uses of time-based payment. This is part of the digital generational shift we might call "beyond the banner." Static, smaller-display advertising is increasingly out of favor, with both prices and clickthrough rates moving deeper into the bargain basement. But marketers want to market, readers want to read, and viewers want to watch, so new methods that combine the marketing of brands and offers and the go-button on media consumption are au courant. That's where value exchange fits. Publishers are seeing double-digit, $10-$19 CPM rates from value exchange, and that's more than many average for their online advertising.
Annual revenues in the significant six figures are now flowing in to the companies that have gotten in early on the business. The big player in publisher-oriented value exchange is Google Consumer Surveys (GCS), a year-old brainchild born out of Google's 20-percent-free-time-for-employees program (and first written about here at Nieman Lab). GCS now claims more than 200 publisher partners, including the L.A. Times, Bloomberg, and McClatchy properties. It says it has so far exposed some 500 million survey "prompts" to readers. GCS will soon have more company in the value exchange game. Companies like Berlin-based SponsorPay, which offers interactive ad experiences in exchange for access mainly to games, are beginning to pursue publisher possibilities, both in Europe and the U.S., where half of SponsorPay's current clients are based. SponsorPay emphasizes mobile and social in its business. L.A.-based SocialVibe, newly headed by hard-charging CEO Joe Marchese, is an ad tech company. It's mainly oriented to non-newspaper media, especially TV companies. How exactly does this value exchange work? Typical is the implementation at one smaller paper, the Whittier Daily News in the L.A. area, one of some 35 Digital First Media papers (both MediaNews and Journal Register brands) that have deployed GCS almost since its inception. Upon reading their 10th, and last, free metered article of the month, readers get a choice: buy a sub for 99 cents for the first month — or take a survey. "Do you own a cat?" for instance. Publishers get a nickel for each _completed_ response. Response rates tend to fall between 10 and 20 percent. "Completion rates" improve by targeting specific questions to specific audiences. The nickels add up. For publishers, then, we have a new acronym: PAM, Paywall Alternative Monetization. Consider the innovation a by-product of the paywall revolution.
If you haven't created a barrier to free access, you have less leverage to force wannabe readers to choose between two options to proceed with their reading. Now, publishers can say: pay me for access with money — or with time. The time is short — measured in seconds or maybe minutes, depending on a video's length or a survey's questions. What does the consumer get for answering a question? It varies. Respondents can get as little as a single "free" article, or an hour, or a day of access. These programs can offer side-by-side offers. For instance, someone like a Press+ (which now powers some 380 newspaper sites) may power a subscription offer in one box, and Google Surveys or a SocialVibe can offer up an alternative in a neighboring one. Digital First Media, long a public skeptic of paywalls, is using value exchange as an adjunct to its paywalls, many of which were deployed before DFM took over management of the MediaNews papers. While it is using it successfully as a paywall alternative, says Digital First Ventures managing director Arturo Duran, it's also finding a couple of other ways to wring money out of surveys. At many of its digital properties, including The Denver Post, its photo- and video-heavy Media Center hub offers Google surveys as speed bumps for continued access. Readers perceive value; enough of them are willing to pay with a few seconds of time to keep getting access to visuals. Similarly, Boston.com's The Big Picture "news stories in photographs" uses GCS. This approach, putting up a speed bump — in the form of a survey — instead of a paywall, explores the nuances of differing consumer valuation of differing parts of news sites. The Texas Tribune has offered a similar approach, having used Google surveys on its extensive data section. How often a survey is deployed can be adjusted by the publisher, working with Google, to maximize revenue while minimizing lost traffic. The search here is for the magic sweet spots.
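The PAM arithmetic is simple to model: a nickel per completed response, times however many paywall hitters finish a survey. A back-of-envelope sketch, using the completion-rate range above; the monthly traffic figure is invented purely for illustration.

```python
def survey_revenue(prompts, completion_rate, payout=0.05):
    """Estimated revenue from survey prompts shown at the paywall:
    publishers earn a nickel (payout=0.05) per completed response."""
    return prompts * completion_rate * payout

# Hypothetical site with 100,000 paywall hits a month offered a survey;
# completion runs 10-20 percent per the figures in the article.
low = survey_revenue(100_000, 0.10)
high = survey_revenue(100_000, 0.20)
print(f"${low:,.0f} to ${high:,.0f} per month")  # $500 to $1,000 per month
```

The nickels add up only at scale, which is why targeting questions to raise completion rates matters as much as raw traffic.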
The Christian Science Monitor is also an early surveys adopter. "We don't have a paywall," says online director David Clark Scott. "So we tried an experimental speed bump." Those bumps were installed first on a single section, and have now grown, popping up on much of the site. One CSM twist: If you come to the site directly, you won't see the surveys. If you come via some search, social, or other referrals, you will. Digital First is also testing survey deployment for a group notoriously hard for the news industry to monetize: international readers. "We can't sell [ads] in Kenya, Japan, and India," says Duran. Instead of fetching bottom-of-the-ad-network prices, as low as 25 cents, surveys can return money in the whole dollars. One lesson so far: "It's a much better experience than an ad" for many readers, says Duran. Publishers are also finding other ways to get readers to "pay." At the Newton (Iowa) Daily News, the paywall also provides these two alternatives: answer a survey question or share an article (via Twitter, Facebook, or Google+) in exchange for continued passage. "It wasn't about market research at all — it was about trading time for content," says Paul McDonald, head of Google Consumer Surveys. McDonald, who developed the product along with engineer Brett Slatkin, says they tested out what people would most likely be willing to do in exchange for some good. They tested a million impressions at The Huffington Post and found that question-answering was the most likable activity. Hence, Google Consumer Surveys. "Most research is stuck in old ways — paper, email, and phone. It's a stagnant industry," McDonald says. The industry, of course, has responded, offering its own critique of GCS' rapid-fire disruption of the market survey space — surveys can be commissioned and deployed within a day, with complete results, broken down by customized demographics (at an extra cost to survey buyers), within 48 hours.
Still, industry reaction is more than mixed, with the positives of Google's new technique winning adherents among bigger brands and smaller businesses. It's a self-service buying technique, borrowing from Google's flagship AdWords model. Interestingly, Google itself is using Surveys to obtain consumer insight. Yes, the company that derives more data from our clicks than anyone still finds that asking a human being a question can yield unexpected learning — which, of course, can be combined with clickstream analytics. YouTube is among the many GCS deployers. It's a new frontier, and one that I think offers a number of curious potentials.

* At scale, _if_ there is scale to the business, it's about significant new sources of revenue.

* As a paywall alternative, it may be a detour that leads back to the road to subscription. If a reader is engaged enough with a news brand over time — kept engaged in part through value exchange — _maybe_ he or she will eventually subscribe. Does a value exchange-using customer have a higher likelihood of subscribing in the future? It's too early to know, but we may soon have sufficient data to see.

* Value exchange could expand the ability to gain customer data. Each time someone trades some time for reading, she or he could be asked for an additional piece of profiling information. Essentially "registered," that new customer becomes more targetable for subscription offers or advertising.

* We can start to widen the idea of trading time for access. Remember the idea of the "reverse paywall," espoused by then-Washington Post managing editor Raju Narisetti and Jeff Jarvis? Spend enough time with a news product, and get rewarded, they proposed. Value exchange begins to structure that kind of relationship, providing value both to readers and publishers. Rough equalization of value would be a painful process, but it may be doable through much experimentation.

* Let's combine two things: the rise of mobile traffic and value exchange. Mobile may not be ad-friendly, but customers might be far more willing to watch a video or touch through a quick questionnaire on a cell phone — and that can ring a different key on the digital cash register. "Mobile is already more diversified," says SponsorPay CEO Andreas Bodczek, explaining that it is moving beyond gaming companies for value exchange and will soon include publishers.

* GCS is an easily deployable tool for small- and medium-sized businesses. As such, it could be an interesting add-on for publishers' emerging marketing services businesses ("The newsonomics of selling Main Street"). That's a line Google could allow newspaper companies to resell, just as many resell Google paid search.
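The Christian Science Monitor's side-door rule mentioned earlier (direct visitors never see the surveys; search, social, and other referrals do) comes down to a referrer check. A minimal sketch under stated assumptions: the function name and the exact gating rules are hypothetical, not the Monitor's actual implementation.

```python
from urllib.parse import urlparse

def show_speed_bump(referrer_url, own_domain="csmonitor.com"):
    """Hypothetical gating rule: direct visits and internal clicks pass
    through free; any external referral (search, social) sees a survey."""
    if not referrer_url:  # no Referer header: a direct visit, no survey
        return False
    host = urlparse(referrer_url).netloc.lower()
    internal = host == own_domain or host.endswith("." + own_domain)
    return not internal

print(show_speed_bump(""))                                 # False (direct)
print(show_speed_bump("https://www.google.com/search"))    # True  (search)
print(show_speed_bump("https://www.csmonitor.com/world"))  # False (internal)
```

The logic rewards loyal direct readers while extracting a few seconds of value from drive-by traffic, which is exactly the trade the article describes.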
Derek Willis, interactive news developer for The New York Times, wrote a blog post about a different way to use analytics. Willis says he's interested in tracking and mapping who is citing and quoting the work of major news outlets (like The New York Times).
The idea behind linkypedia is that links on Wikipedia aren’t just references, they help describe how digital collections are used on the Web, and encourage the spread of knowledge: “if organizations can see how their web content is being used in Wikipedia, they will be encouraged and emboldened to do more.” When I first saw it, I immediately thought about how New York Times content was being cited on Wikipedia. Because it’s an open source project, I was able to find out, and it turned out (at least back then) that many Civil War-era stories that had been digitized were linked to from the site. I had no idea, and wondered how many of my colleagues knew. Then I wondered what else we didn’t know about how our content is being used outside the friendly confines of nytimes.com. That’s the thread that leads from Linkypedia to TweetRewrite, my “analytics” hack that takes a nytimes.com URL and feeds tweets that aren’t simply automatic retweets; it tries to filter out posts that contain the exact headline of the story to find what people say about it. It’s a pretty simple Ruby app that uses Sinatra, the Twitter and Bitly gems and a library I wrote to pull details about a story from the Times Newswire API.
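Willis describes his Ruby app's core heuristic: drop tweets that merely contain the story's exact headline, keeping the ones where people actually say something. A rough Python approximation of that filter follows; the sample tweets are invented, and this is a sketch of the idea, not his actual code.

```python
def interesting_tweets(tweets, headline):
    """Keep tweets that add commentary; drop any that contain the
    story's exact headline (auto-retweets, share-button posts)."""
    needle = headline.strip().lower()
    return [t for t in tweets if needle not in t.lower()]

# Invented sample mentions of one (real) NYT story.
tweets = [
    "Snow Fall: The Avalanche at Tunnel Creek nyti.ms/abc",          # echo
    "Wow. Snow Fall: The Avalanche at Tunnel Creek is astonishing",  # echo + fluff
    "The NYT's Tunnel Creek multimedia package raises the bar",      # commentary
]
kept = interesting_tweets(tweets, "Snow Fall: The Avalanche at Tunnel Creek")
print(kept)  # only the third, genuinely original tweet survives
```

A substring match is crude (it also drops tweets that quote the headline and add a real opinion), which is the trade-off any headline-echo filter makes.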
As the BBC News site publishes many articles every day, it publishes many images too. BBC News has an automated process that creates 18 different versions of each published image.
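Producing many versions of one image is mostly a matter of resizing down a ladder of target widths while preserving the aspect ratio. A minimal sketch of the dimension math; the breakpoint widths below are invented for illustration, not the BBC's actual set of 18.

```python
def rendition_sizes(src_w, src_h, widths):
    """Target (width, height) pairs for each breakpoint width,
    preserving aspect ratio and never upscaling past the source."""
    sizes = []
    for w in widths:
        w = min(w, src_w)            # never upscale
        h = round(src_h * w / src_w)
        if (w, h) not in sizes:      # dedupe clamped widths
            sizes.append((w, h))
    return sizes

# Hypothetical breakpoint ladder; the BBC's real pipeline emits 18 sizes.
LADDER = [96, 144, 224, 304, 464, 624, 800, 976]
print(rendition_sizes(1024, 576, LADDER))
```

At serve time, responsive pages then pick the smallest rendition that covers the device's display width, which is the point of generating the ladder up front.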
If I had only one short sentence to describe it, I'd say that journalism is _factual reports of current events_. At least, that's what I used to say, and I think it's what most people imagine journalism is. But reports of events have been a shrinking part of American journalism for more than 100 years, as stories have shifted from facts to interpretation. Interpretation: analysis, explanation, context, or "in-depth" reporting. Journalists are increasingly in the business of supplying meaning and narrative. It no longer makes sense to say that the press only publishes facts. New research shows this change very clearly. In 1955, stories about events outnumbered other types of front-page stories nearly 9 to 1. Now, about half of all stories are something else: a report that tries to explain why, not just what. This chart is from a paper by Katharine Fink and Michael Schudson of Columbia University, which calls these types of stories "contextual journalism." (The paper includes an extensive and readable history of all sorts of changes in journalism in the 20th century; recommended for news nerds.) The authors sampled front-page articles from The New York Times, The Washington Post, and the Milwaukee Journal Sentinel in five different years from 1955 to 2003, and hand-coded each of 1,891 stories into one of four categories:

* conventional: a simple report of an event which happened in the last 24 hours

* contextual: a story containing significant analysis, interpretation, or explanation

* investigative: extensive accountability or "watchdog" reporting

* social empathy: a story about the lives of people unfamiliar to the reader

Investigative journalism picks up after the 1960s but is still only a small percentage of all front-page stories. Meanwhile, contextual journalism increases from under 10 percent to nearly half of all articles.
The loser is classic "straight" news: event-centered, inverted-pyramid, who-what-when-how-but-not-so-much-why stories, which have become steadily less popular. All this in the decades _before_ the modern Internet. In fact, previous work showed that the transition away from events began at the dawn of the 20th century. Investigative journalism may have pride of place within the mythology of American news, but that's not really what journalists have been up to, by and large. Instead, newspaper journalists have been producing ever more of a kind of work that is so little discussed it doesn't really have a name. Fink and Schudson write:
…there is no standard terminology for this kind of journalism. It has been called interpretative reporting, depth reporting, long-form journalism, explanatory reporting, and analytical reporting. In his extensive interviewing of Washington journalists in the late 1970s, Stephen Hess called it ‘social science journalism’, a mode of reporting with ‘the accent on greater interpretation’ and a clear intention of focusing on causes, not on events as such. Although this category is, in quantitative terms, easily the most important change in reporting in the past half century, it is a form of journalism with no settled name and no hallowed, or even standardized, place in journalism’s understanding of its own recent past.

From this historical look, fast forward to the web era. The last several years have seen a broad conversation about "context" in news. From Matt Thompson's key observation that a series of chronological updates don't really inform, to Studio 20's Explainer project, to a whole series of experiments and speculations around story form, context has been a hot topic for those trying to rethink Internet-era journalism. I believe this type of contextual journalism is important, and I hope we will get better at understanding and teaching it. The Internet has solved the basic distribution of event-based facts in a variety of ways; no one needs a news organization to know what the White House is saying when all press briefings are posted on YouTube. What we do need is someone to tell us what it means. In other words, journalism must move up the information food chain — as, in fact, it has steadily been doing for five decades! Why does this type of journalism not even have a name? I have a suspicion. I think part of the problem is the professional code of "objectivity." This is a value system for journalism that has many parts: truth seeking, neutrality, ethics, credibility.
But all of these things are different when the journalist's job moves from describing events to creating interpretations. There are usually multiple plausible ways to interpret any event, so what are our standards for saying which interpretations are right? Journalism has a long, sorry history of professional pundits whose analyses of politics and economics turn out to be no better than guessing. In concrete fields such as election forecasting, it may later be obvious who was right. In other cases, there may not be a "right" answer in the traditional, positivist sense of science. These are the classic problems of framing: Is a 0.3 percent drop in unemployment "small" or is it "better than expected"? True neutrality becomes impossible in such cases, because if something has been politicized, you're going to piss someone off no matter how you interpret it. (See also: hostile media effect.) There may not be an objectively correct or currently knowable meaning for any particular set of factual events, but that won't stop the fighting over the narrative. This seems to be a tricky place for truth in journalism. Much easier to say that there are objective facts, knowably correct facts, and that that is all journalism reports. The messy complexity of providing real narratives in a real world is much less authoritative ground. Nonetheless, we all crave interpretation along with our facts. Explanation and analysis and _storytelling_ have become prevalent in practice. We as audiences continue to demand certain types of experts, even when we can't tell if what they're saying is any good. We demand reasons why, even if there can be no singular truth. We demand narrative. What this latest research says to me is that journalism has added interpretation to its core practice, but we're not really talking about it. The profession still operates with a "just the facts, ma'am" disclaimer that no longer describes what it actually does. 
Perhaps this is part of why media credibility has been falling for decades. _Photo of Sol LeWitt's "Objectivity" (1962) via AP/National Gallery of Art._
Fan fiction — fan-written stories featuring characters drawn from pop culture properties, like a tale in which Chewbacca and Boba Fett become star-crossed lovers in 1950s New Jersey — is a huge phenomenon. On one end of the scale, the _Fifty Shades of Grey_ books started out as fan fiction and became _the_ publishing success of 2012; on the other, hundreds of thousands of people put their favorite characters into unusual situations in stories posted free on hubs like Archive of Our Own and FanFiction.Net. The problem with fan fiction as a publishing business is that it's of questionable legality. The creators of those characters — the writers of movies, TV shows, and books, or the corporate entities that control their rights — don't want people selling new stories involving them. (Chewbacca's love for Boba Fett was always a forbidden love.) And making the licensing arrangements necessary to publish fan fiction for a profit was generally too much of a bother for anyone to pursue. The result: turning fan fiction into a business has been somewhere between impractical and impossible. Amazon took a big step toward cutting that Gordian knot today by announcing it had made licensing agreements with three fanfic-popular properties — Gossip Girl, Pretty Little Liars, and Vampire Diaries — that will allow fics for those properties to be published for the Kindle, with revenue split between the author and the rightsholder. More deals are on the way.
The new Kindle Worlds platform will enable any author to publish stories based on these characters and then make them available for purchase through the Kindle Store. Amazon will then pay royalties both to the author of the fan fiction and the original rights holder. The standard author’s royalty rate — for fiction that is at least 10,000 words in length — will be just over a third (35 percent) of net revenue.

This has _major_ monetization potential — _if_ fan fiction communities used to getting their fix for free (and in an open, episodic environment) buy into the idea of paying for it (or _others_ getting paid for it). FAMILY SELF-PROMOTION ALERT: If you're interested in the subject of fan fiction, you should look into _Fic: Why Fanfiction is Taking Over the World_ by University of Utah professor Anne Jamison, which will be published later this year. (And edited by my wife, Leah Wilson.)
"We're used to treating information as 'free,'" writes Jaron Lanier in his latest book _Who Owns the Future?_, "but the price we pay for the illusion of 'free' is only workable so long as most of the overall economy isn't about information." Lanier argues that a free-culture mindset is dismantling the middle-class economy. In his estimation, the idea "that mankind's information should be free is idealistic, and understandably popular, but information wouldn't need to be free it no one were impoverished." _Who Owns the Future?_, like his 2010 book _You Are Not a Gadget_, is another manifesto attempting to rebuff what he sees as the contemporary ethos of the web. But the followup also refreshingly attempts to pose solutions, one where all participants in this information-based world are paid for what they do and distribute on the web. Throughout, it places particular emphasis on the ways digital technology has unsettled the so-called "creative class" — journalists, musicians, photographers, and the like. As he sees it, the tribulations of those working in such fields may be a premonition for the middle class as a whole. It's "urgent," he writes, "to determine if the felling of creative-class careers was an anomaly or an early warning of what is to happen to immeasurably more middle-class jobs later in this century." I recently spoke with Lanier and we discussed the ways he sees digital networking disrupting the media, why he thinks advertising can no longer sustain paid journalism, and why he misses the future. Lightly edited and condensed, here's a transcript of our conversation.
ERIC ALLEN BEEN: You were one of the early advocates of the notion that "information wants to be free," an idea most media companies initially embraced when it came to the web, and one that some now seem to regret. Could you talk a little bit about why you changed your mind on this line of thinking?

_Photo of Jaron Lanier by Dan Farber used under a Creative Commons license._
JARON LANIER: Sure. It was based on empirical results. The idea sounded wonderful 30 years ago. It sounded wonderful in the way that perfect libertarianism or perfect socialism can. It sounds right, but with all these attempts to make a perfect system, it doesn't work out so well. Empirically, what I've seen is the hollowing out of middle-class opportunities and that there is an absurdity to the way it's going. I think we're not getting the benefits that I initially anticipated.
BEEN: When it came to journalism, what were some of those benefits that you originally expected? I imagine you then thought it would be a largely positive thing.
LANIER: Yeah. To use the terminology of the time, we — that is, me and others who were behind a lot of the ideas behind the Web 2.0 ethos or whatever — wanted to "supplant" or "make obsolete" the existing channels of journalism and the existing types of jobs in journalism. But what would come instead would be better — more open and all of that — and less intermediated. What happened instead was a little bit of what we anticipated. In a sense, the vision came true. Yes, anybody can blog and all that — and I still like that stuff — but the bigger problem is that an incredible inequity developed where the people with big computers who were routing what journalists did were getting all the formal benefits. Mainly the money, the power. And the people who were doing the work were so often just getting informal benefits, like reputation and the ability to promote themselves. That isn't enough. The thing that we missed was how much power would accrue to the people with the biggest computers. That was the thing we didn't really think through.
BEEN: Historically, technological advances have caused disruptions to industries, but they've also tended to provide new jobs to replace the wiped-out ones. There seems to be some optimism in a lot of quarters that journalism can eventually get on the right track, economically speaking, within the digital world. But you don't think so.
LANIER: The system is slowly destroying itself. I'll give you an example of how this might work out. Let's suppose you say in the future, journalists will figure out how to attach themselves to advertising more directly so they're not left out of the loop. Right now, a lot of journalism is aggregated in various services that create aggregate feeds of one kind or another and those things sell advertising for the final-stop aggregator. And the people doing the real work only get a pittance. A few journalists do well but it's very few — it's a winner-take-all world where only a minority does well. Yes, there are a few people, for instance, who have blogs with their own ads and that can bring in some money. You can say, "Well, isn't that a good model and shouldn't that be emulated?" The problem is that they're dependent on the health of the ad servers that place ads. Very few people can handle that directly. And the problem with that is the whole business of using advertising to fund communication on the Internet is inherently self-destructive, because the only stuff that can be advertised on Google or Facebook is stuff that Google hasn't already forced to be free. As an example, you might have a company that makes toys and you advertise the toys on Google, and that might show up in journalism about toy safety or something. So journalists can eke out some money from people who sell toys. That's kind of like the traditional model of advertising-supported journalism. But every type of business that might advertise on Google is gradually being automated and turning into more of an information business. In the case of toys, there's a 3-D printer where people print out toys. At some point, that will become better and better and more common, and whenever that happens, what happened to music with Napster will happen to toys. It'll be all about the files and the machines that actually print out the toys.
If the files that print out the toys can be made free, the only big business will be the routing of those files, which might be Google or Facebook handling that, and there will be nobody left to advertise on Google. That'll happen with everything else — pharmaceuticals, transportation, natural resources — every single area will be subject to more and more automation, which doesn't have to put people out of work. The only reason automation leads to unemployment is the idea of information being free. It's a totally artificial problem, but if journalists are counting on the Google model to live on, it won't work. Google is undermining itself, and there will be no one left to buy advertisements.
BEEN: Speaking of advertising, I'm interested in hearing what you think about a lot of people currently lauding BuzzFeed and its use of native advertising. There's a lot of talk about it solving "the problems of both journalism and advertising at once", or it being some sort of guiding light for a "future of paid journalism."
LANIER: Advertising, in whatever form, just can't be the only possible business plan for information. It forces everybody to ultimately compete for the same small pool of advertisers. How much of the economy can advertising really be? It can't be the whole market. Why on earth are Google and Facebook competing for the same customers when they actually do totally different things? It's a peculiar problem. You're saying that there's only one business plan, one customer set, and everybody has to dive after that. It becomes a very narrow game — there's not enough there for everybody. It could work out locally a little bit, but it's not an overall solution.
BEEN: And your solution is what you call a "humanistic information economy." Could you talk a little bit about how such a system would work?
LANIER: There are some theoretical reasons that lead me to believe that if you monetized a deeply connected open network, the distribution of benefits to people would look like a middle class. In other words, there would be a lot of wealth in a lot of people's hands that could outspend any elite, which is critical for democracy and a market economy to survive. So one benefit is you could get a consistent middle class even when the economy gets really automated. It becomes a real information economy. A humanistic economy would create a middle class in a new way, instead of through unions and other ad hoc mechanisms. It would create a middle class by compensating people for their value in terms of references to the network. It would create an expanding economy instead of a static one, which is also important. It's built around the people instead of the machines. It would be a change in paradigm.
BEEN: In the book, you write: "If we demand that everyone turn into a freelancer, then we will all eventually pay an untenable price in heartbreak." But a lot of what you're proposing strikes me, in some senses, as a freelance economy.
LANIER: That's right. What I'm proposing is actually a freelance economy, but it's a freelance economy where freelancing earns you not just income but also wealth. That's an important distinction to make. What I think should happen is as you start providing information to the network, it then will become a part of other services that grow over time. So, for instance, let's suppose you translate between languages, and some of your translations provide example phrase translations that are used in automatic translators. You would keep getting dribbles of royalties from having done that, and you start accumulating a lot of little ways that you're getting royalties — not in the sense of contractual royalties, just little payments from people that are doing things that benefited from information you provided. If you look at people's interest in social networking, you see a middle-class distribution of interest. A lot of people would get a lot of little dribs and drabs, and it would accumulate over a lifetime so you'd start to have more and more established information that had been referenced by you that people are using. What should happen is you should start accumulating wealth, some money that shows up because of your past as well as your present moment.
BEEN: So if I simply shared a link to a New York Times article on Twitter, for instance, would there be a payment exchange? If so, who would it go to?
LANIER: It would be person-to-person payments. Right now, we're used to a system where you earn money in blocks, like a salary check, and you're spending on little things like coffee or something. And in this system, you'd be earning lots of little micropayments all the time. But you would be spending less often. That terrifies people, but it's a macroeconomic thing. I believe the economy would actually grow if information was monetized, and overall your chances will get a lot better than they are now.
BEEN: You say in the book that this person-to-person payment system is partly inspired by the early work of the sociologist and information technology pioneer Ted Nelson. In particular, his thoughts about two-way linking over a network. Could you talk a little bit about why you think this is a better way to exchange information?
LANIER: The original concept of digital networking that predated the actual existence of digital networking is Ted Nelson's work from the 1960s. It was different from the networks we know today in a few key ways. All the links were two-way, for one. You would always know who was linking to your website — there would always be backlinks. If you have universal backlinks, you have a basis for micropayments from somebody's information that's useful to somebody else. If the government camera on a corner catches you walking by, and it matches against you, you'd be owed some money because you contributed information. Every backlink would be monetized. Monetizing actually decentralizes power rather than centralizing it. Demonetizing a network actually concentrates power around anyone who has the biggest computer analyzing it.
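Lanier's two-way-link idea can be made concrete with a toy sketch. This is not Nelson's actual Xanadu design or any real system; the class, the flat per-use fee, and all names are invented for illustration. The point is only the mechanism he describes: every link is recorded in both directions, so the network always knows who contributed a piece of information and can route a small payment back to them each time it is used.

```python
# Toy model of monetized backlinks (illustrative only; fee model is invented).
from collections import defaultdict

class TwoWayLinkNetwork:
    def __init__(self, fee_per_use=0.001):
        self.backlinks = defaultdict(list)   # target id -> ids that link to it
        self.owners = {}                     # content id -> contributor
        self.balances = defaultdict(float)   # contributor -> accrued royalties
        self.fee = fee_per_use

    def publish(self, content_id, owner):
        self.owners[content_id] = owner

    def link(self, source_id, target_id):
        # Unlike the one-way web, the target always knows who links to it.
        self.backlinks[target_id].append(source_id)

    def use(self, content_id, payer):
        # Each use of linked content credits the original contributor.
        owner = self.owners[content_id]
        self.balances[owner] += self.fee
        self.balances[payer] -= self.fee

net = TwoWayLinkNetwork()
net.publish("translation-42", owner="alice")      # Lanier's translator example
net.link("phrasebook-app", "translation-42")      # backlink is recorded
net.use("translation-42", payer="bob")            # a dribble of royalty flows
```

Under this sketch, alice accumulates "dribs and drabs" over time from every downstream use of her contribution, which is the middle-class wealth-accumulation effect Lanier describes below.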
BEEN: Let's talk about that last point. This is an example of what you call in the book a "Siren Server." That is, computers on a network that gather data without conceding that money is owed to those individuals mined for the information.
LANIER: That's right. It's my name for one of the biggest, best, most effective, connected computers on the network. A Siren Server is a big server farm — a remote unmarked building somewhere in the countryside near a river so it can get cooled. It has tons of computers that run as one. It gathers data from the world for free and does more processing of that data than normal computers can do. What it does with the processing is it calculates several moves that the owners can make that put them at an advantage based on a global perspective. If you're Amazon, it means you keep track of everybody else's prices in the world, including little local independent stores, so you can never be outsold. If a store wants to give a book away, Amazon will also do that, so nobody gets a local advantage. If you're Google, it gives advertisers a way to use a behavioral model of the world to predict which options in front of you are most likely to steer you. If you're a finance company, it's a way of bundling derivatives in such a way that somebody else is holding the risk. It's almost a cryptographic effort. If you're an insurance company, it's a way of calculating how to divide populations so you insure the people who least need to be insured. In all these cases, a giant computer calculates an advantage for yourself and you get a global perspective that overwhelms the local advantage that participants in the market might have had before.
BEEN: In the book, you call Craigslist a Siren Server, one that "created a service that has greatly increased convenience for ordinary people, while causing a crisis in local journalism that once relied on paid classified ads." You write that it "has a tragic quality, since it is as modest and ethical as it can be, eschewing available spying opportunities, and yet it still functions as a Siren Server despite that." So a Siren Server, in your mind, isn't necessarily always a malevolent construction.
LANIER: That's true. I don't think there's much in the way of evil or competitive intent. It's the power of having one of the biggest computers. When you suddenly get power by surprise, it's a seduction. You don't realize that other people are being hurt. But if it wasn't Craigslist, it would have been something else. Some computer gets a global perspective on everything and the local advantage goes away. Craigslist calculated away the local advantage that newspapers used to have.
BEEN: So far, the reviews of _Who Owns the Future?_ have been largely positive. But in The Washington Post, Evgeny Morozov criticized it by saying "Lanier's proposal raises two questions that he never fully confronts." One being whether a nanopayment system would actually help the middle class once automation hits its tipping point. He cites cab drivers being replaced by self-driving cars and says: "Unless cabdrivers have directly contributed to the making of maps used by self-driving cars, it's hard to see how a royalty-like system can be justified."
LANIER: This has to do with the value of information. In the book I ask this very question — in the future, in the case of self-driving cars, it's certainly true that once you've been through the streets once, why do it again? The reason is that they're changing. There might be potholes, or there might be changes to local traffic laws and traffic patterns. The world is dynamic. So over time, maps of streets that need cars to drive on them will need to be updated. The way self-driving cars work is big data. It's not some brilliant artificial brain that knows how to drive a car. It's that the streets are digitized in great detail. So where does the data come from? To a degree, from automated cameras. But no matter where it comes from, at the bottom of the chain there will be someone operating it. It's not _really_ automated. Whoever that is — maybe somebody wearing Google Glass on their head that sees a new pothole, or somebody on their bike that sees it — only a few people will pick up that data. At that point, when the data becomes rarified, the value should go up. The updating of the input that is needed is more valuable, per bit, than we imagine it would be today. Cabbies themselves, that's irrelevant. There won't be cabbies. They'll have to be doing other things.
BEEN: His other question is "how many [online] services would survive his proposed reforms?" Morozov brings up Wikipedia and says the "introduction of monetary incentives would probably affect authors' motivation. Wikipedia the nonprofit attracts far more of them than would Wikipedia the startup."
LANIER: But in what I'm proposing, Wikipedia would not pay you — it would be a person-to-person thing. I'm proposing that there's no shop and people are paying each other when they create things like Wikipedia. Which is very different. If it's going through a central hub, it creates a very narrow range of winners. If it's not, it's a whole different story. The online services that would survive would be the ones that can add value to the data that people are providing anyway. Instagram could perhaps charge to do cool effects on your pictures, but the mere connections between you and other people would not be billable, it would just be normal. People would pay each other for that. The services would have to do more now than they are. A lot of services are just gatekeepers and would not survive and they shouldn't. It would force people to up their game.
BEEN: Speaking of upping one's game, you get a strong sense throughout the book that you think society is no longer future-minded. Towards the end, you write that you "miss the future." What do you mean by that statement?
LANIER: It seems that there's a loss of ambition or a lowering of standards for what we should expect from the future. We hyped up things like being able to network — and we understood it was a step on a path — but these days I call the open-source idea the MSG of journalism. An example would be this: Take some story that would be totally boring, like garbage bags are being left on the street. But if you say, "open-source software is being used to track garbage bags on the street," there's something about it that it makes it seem interesting. And that makes it a low bar for what seems interesting. A very unambitious idea of what innovation can be.
Thanks to new funding from Knight Foundation, the Internet Archive is expanding its collection of TV news broadcasts. The archive also plans to build a better search and user experience around the clips, which can only be viewed online and not downloaded.
The expansion plan is being supported by $1 million in funding from Knight Foundation. With this support, we will grow our TV News Search & Borrow service, which currently includes more than 400,000 broadcasts dating back to June 2009, to add hundreds of thousands of new broadcasts. This means helping inform and engage communities by strengthening the work of journalists, scholars, teachers, librarians, documentarians, civic organizations and others dedicated to public benefit.

With TV News Search & Borrow, these folks can use the closed captions that accompany news programs to search for information. They can then browse short streamed video clips and share links to specific ones.
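The caption-driven search described above can be sketched in a few lines. This is not the Archive's actual index or API; the data model, field names, and URL shape are invented for illustration. The mechanism is simply full-text search over timed caption segments, with each hit pointing at the moment in the clip where the phrase was spoken.

```python
# Toy caption search (illustrative; real TV News Search & Borrow differs).
broadcasts = [
    {"id": "cnn-2013-05-20", "captions": [
        (0, "breaking news from oklahoma"),
        (30, "the internet archive expands its tv news collection"),
    ]},
    {"id": "pbs-2013-05-21", "captions": [
        (0, "knight foundation announces new funding"),
    ]},
]

def search_captions(query, broadcasts):
    """Return shareable links to caption segments containing the query."""
    query = query.lower()
    hits = []
    for b in broadcasts:
        for start, text in b["captions"]:
            if query in text:
                # Link straight to the matching moment in the clip
                # (URL format is hypothetical).
                hits.append(f"https://archive.org/details/{b['id']}?start={start}")
    return hits

print(search_captions("archive", broadcasts))
```

Because captions are timestamped, a match can deep-link to a specific segment rather than a whole broadcast, which is what makes the clips shareable.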
David Campbell's post pulls together some interesting stats on online news consumption, in particular video:
- News is a popular category on YouTube (it was the most searched-for item in four out of 12 months in 2011)
- There is no strict correlation between length of video and popularity — one-third of popular videos were 2-5 minutes in length, and nearly one-fifth were longer than 5 minutes
- Ooyala, a large video streaming platform, reported that long-form videos of 10+ minutes accounted for 57% of viewing time on the tablets it served
- Multimedia completion rates can also be good: MediaStorm says that more than half, and often two-thirds, of those viewing its stories online stay with them to the end, even when stories run to 20 or more minutes
Twitter officially patented its "pull-to-refresh" technology for streaming on its mobile app today, The Verge reports. But Twitter also has an original, internal approach to patent applications. All of Twitter's patents include a contract, the Innovator's Patent Agreement (IPA), in which the company agrees to engage in patent litigation only if it is sued first. The contract is meant to address the concerns of the engineers whose work is being patented, who feel the definition of defensive litigation can be fuzzy.
"[Engineers] were going around saying we're worried about what patents mean," said Twitter IP attorney Ben Lee, who drafted the IPA and guided it through the revision process. "The IPA is an expression of the values of the company." Lee's work on the IPA began during his initial job interview with Twitter general counsel Alex Macgillivray in November of 2010. "The notion of trying to come up with new ways of handling patents was a major reason for me coming to Twitter in the first place," he said. "I don't think it was that long after that we were already having significant conversations with the engineers and senior management about some things we could do." Unfortunately, work on the IPA was put on hold not long after Lee joined Twitter — a patent troll had sued the company over a junk patent on "virtual communities," and Lee spent serious time living in a Virginia hotel room as the case went to trial. "We've seen the negative impact" of patent abuse, he says. "And we're a young company."
A brand guru. That's what they called Baba Shetty when he was hired away from advertising agency Hill Holliday by The Daily Beast to be the new CEO of The Newsweek Daily Beast Company. Less than a month later, the company announced that Newsweek was putting an end to its print edition and going all-digital. Last week, Shetty released the beta version of the relaunched website, a simple, colorful, responsive, and easily navigable new home for the decades-old news brand. Shetty began working with the magazine on a "Mad Men"-themed issue on retro advertising back in March 2012. So maybe it's not surprising that the new site's first feature article is an exploration of what makes contemporary television so addictive. Shetty has big plans for capitalizing on the historically respected Newsweek name, blending a New York Times-like metered paywall approach with an ambitious sponsorship model that will see a lot of creative ad work coming off the Newsweek desk. On Monday, Shetty and I spoke about how he sees that plan unfolding, as well as some of his favorite new design features, bringing classic Newsweek covers into the digital space, and why ad agencies should act more like newsrooms. Here's our conversation:
O'DONOVAN: So let's start with the redesign! Congrats, first of all — very exciting.
SHETTY: Oh, thank you.
O'DONOVAN: I'm curious, first, who you were looking to for inspiration with the redesign and what your major goals were.
SHETTY: The audience is a combination of the people who've always looked to Newsweek for its sense of authority, its sense of editorial authority and its stature — its ability to offer perspective on the happenings in the world. But we also wanted to really innovate around the narrative formats for longform publishing on the web. The real story of the Newsweek relaunch is that it allowed us to think about innovation in a way that really hasn't happened much for professional journalism. Actually, there's been a ton of innovation in microblogging and other formats — look at the Tumblr news from the last couple days. Enormous value from thinking about beautiful user experience for content consumption. But really, a lot of the professional editorial products kind of slavishly follow a set of conventions that are all about maximizing pageviews. You look at a long article that might require seven clicks and page reloads to get through — and then there's a lot of display advertising that is competing for attention with the actual content. We thought there was an opportunity to do for professional journalism what Tumblr and Pinterest and Flipboard, so many of the other innovative new startups, have done for other kinds of content. So what we see with Newsweek is the user first. I've been talking about it as user-first publishing. The idea is, let's deconstruct the sense of _magazineness_ — not as a physical thing, but as a concept. The sense of magazineness is about a beautiful user experience. You think about your favorite magazine and sitting in your favorite chair at home and reading it — there's a sense of editorial coherence. You know — the cover communicates a sense of editorial priority, there's a table of contents that lends a sense of coherence to the issue. It's a beautiful package that results. 
But when magazines go digital, so much of that's lost because of the conventions I talked about before — you slice and dice content into the slivers that we call pageviews, and it's not a very satisfying experience to read professional journalism on the web. So we really wanted to take a leap forward with Newsweek. In addition to the idea of the editorial stature and credibility of Newsweek, also creating a radically creative user experience around that content. I can talk about a few of the features if you think that would be useful.
O'DONOVAN: Yes, but I'm still curious about other projects, other sites, other redesigns, that you might have taken something from, or tried to emulate at all. Or maybe this is a ground zero thing. But for example, The New Republic's redesign, or maybe Quartz — is there a trend?
SHETTY: There really weren't — we didn't really emulate anything. What we were trying to do was stay true to Newsweek and what the ideal user experience would be. The cover — there actually is a cover, and it was static in the first issue, and in future issues it will be interactive, video-based multimedia. It's this idea of drawing a reader in to something that has great editorial prominence and priority, and we're going to explore what the cover could be in the digital age. There is a persistent table of contents which is available to you at any part of the experience, and that lends a sense of completeness and coherence to this experience.
O'DONOVAN: Yeah, the table of contents gives an element of navigability — it helps you understand the fullness of the product.
SHETTY: Exactly. It's persistent. No matter where you are, in an article or on a page, when you mouse over the window, the table of contents dissolves into view, and you can access it. So there's a sense of, again, an ideal concept of magazineness, and part of it is this sense of complete control over the content consumption experience. So we thought, we'd love to make that real in a natively digital format. Of course, we took account of all the devices that people read on now, so the site is fully responsive and looks beautiful on a handset or tablet screen or — you should really try it on a 23-inch monitor. It's gorgeous in large format screens. It gracefully apportions itself to whatever the screen happens to be.
O'DONOVAN: What would you say, right now, the focus is on in mobile, in building apps? I feel like there's this turn back towards building cross-platform websites and away from apps. Where did apps fall into your priorities when you started compared to where you are now?
SHETTY: Yes, you're exactly right. I think 18 months ago, everybody was talking about native apps as absolutely the way to go. But there's a lot of friction in the app experience, and what I mean by that is apps have to be downloaded, apps have to be used and accessed on a regular basis, apps sometimes make it a little more difficult to share content. People are sometimes not as adept at sharing content via apps as they are across the open web. So for us, it's about giving consumers a choice. We're going to parallel-path for a while — we'll also have a Newsweek app available. But the open web launch we did last week we think is actually a beautiful experience across devices. It's friction-free — there's nothing to download, there's nothing that prevents easy sharing. So it's designed to kind of be — I don't want to say post-app, but it's post- the initial way of publishing thinking, that native apps are the only way to go. I think a well designed, thoughtfully engineered open web experience can be terrific for the user.
O'DONOVAN: You mentioned building an interactive cover page earlier — I'd be interested in knowing what other kinds of engagement you're interested in building across the site. How did you think about structuring comments? How do you want people to respond to the site?
SHETTY: We thought a lot about socially driven content, and if you actually look at an article called "The Way They Hook Us — For 13 Hours Straight," which is about longform, binge-viewing, addictive TV shows — you know, "Breaking Bad," "Game of Thrones," et cetera — if you look at that story, you can see how we handle social. Instead of having commentary being a thing that is relegated to the bottom of the page, there's a set of functionality on the left-side margin that moves along with the story. Right now, there's 2,100 opinions listed — it's a way to kind of over time have the idea that engagement opportunities are persistently available, no matter where you are reading these stories — it's not just a thing that's relegated to the bottom of a page. There's a tray that actually slides out to reveal the social features. And there's a lot of innovation we have planned in that area as well. And while we're talking about a long article page, you can kind of see the ability to use multimedia photography, video, infographics to help the journalistic storytelling of a longform piece. That's another, I think, terrific step forward. It's not the tyranny of the pageview, it's not the conventions that are going to deliver more advertising properties — it's thinking about the user first. What's going to make for a great reading experience? In that way, I think it differs from a lot of the conventions that are in play across the web.
O'DONOVAN: So this is my understanding having read a couple things, so correct me if I'm wrong — but your strategy is first to build this product that people are going to want, and then slowly to introduce a paywall, and then later this sponsored content component. Can you explain how you see that unfolding and over what kind of timeline?
SHETTY: I can talk a little bit about it — I probably can't talk about all of our plans right now. The metered access is going to be rolled out fairly soon, and that's just the simple idea that, look, anybody can read any article on Newsweek, and initially that's completely open and completely free. But only subscribers will be able to consume content over a certain number of articles. So it's very similar to what The New York Times and others have done. Open access — we want a lot of social sharing, we want a lot of visibility of the content across the open web. But what we're asking is, if people consume over a certain amount of content, that they subscribe. And that's going to take place fairly soon. The second question is how brands can participate. We have the same principles we've been talking about — thinking about the user first — applied to brand participation. What we're going to do is limit the clutter — relatively few units, but really high impact — but stay with the design aesthetic of the site overall. They're going to be beautiful, unignorable, but the value exchange with the reader is going to be very appropriate. When you listen to a program on NPR, and there's a sponsorship message before the program starts, you can kind of say, okay, well, I get that. I get how that works. It's a reasonable exchange between the audience and the brand that sponsors the content. That's really the model. It's not as much about the standards of display advertising that have dominated the discussion on the web. It's a sponsorship model — a different direction.
O'DONOVAN: From a structural standpoint, in terms of building the sponsorship and how closely married they may be to the content you have, I'm curious if it's going to be an internal team and how closely they'll work with the editorial team, or if it's someone from outside. How does that all work?
SHETTY: Oh, it's all part of one organization in our company, and it's a close partnership between the editorial and business sides.
O'DONOVAN: I was just reading earlier, you wrote, along with someone else, a piece for the Harvard Business Review about how advertising companies should act more like newsrooms. I was hoping you could explain that theory and maybe, I'd be curious to know if that was an idea that started to percolate for you having been in a newsroom for a little while.
SHETTY: It actually started percolating for me well before I came into a newsroom. I think it's actually a pretty clear direction that has been well represented by a lot of people. There's a real opportunity for smart brands to publish content that's useful, interesting, engaging, and helpful to their audience. It's not a new idea — in fact I always talk about the fact that it's an idea that's been around for a very long time. But what's changed is all the tools that are available for content creation, distribution, measurement and all the channels that are available to brands. I think it's a very powerful idea. I don't think it's one of these trend-of-the-season ideas. I think it's a dramatic industry shift that we're going to be tracking for years to come, through various iterations. That was something I did with Jerry Wind, head of the Future of Advertising Program at Wharton. It was really based on the Wharton 2020 Project, which was asking a lot of advertisers about what they think about the future of advertising, and it was such a consistent theme — that it's going to be less and less about what we think of advertising today, and more content that is voluntarily consumed by people because they view it as in some way useful or interesting.
O'DONOVAN: As we continue to see this trend toward sponsored content and cooperation between advertisers and news brands, I'm curious what your advice might be to other people who are following a path similar to yours — coming from the ad side and moving into newsroom, operating as the person who is trying to bring those two things together. Are there any specific challenges or surprises there? How would you tell someone to pursue that?
SHETTY: I would just say think about the user first, and by the way, think about editorial standards. It doesn't serve anyone to have editorial standards compromised. Users don't want that, the consumer doesn't want that, and certainly it doesn't benefit the editorial side of things either. Nobody wants that. I think full transparency and good judgment are critical here.
O'DONOVAN: How do you telegraph that to the reader?
SHETTY: Well, we don't really — we haven't really had any issues with telegraphing that. It's just kind of clearly indicating where, what the source of a particular piece of content is. I think as long as you maintain these kind of standards, there really aren't issues.
O'DONOVAN: And in terms of the user-centric experience you're trying to build — you're talking about how modern newsrooms have so many different kinds of metrics available to them now — when I hear people talk about building new products like this, they talk about building something light and flexible, and prototyping it so you can really respond to the audience's initial reaction to it. I'd be curious to know how you're tracking that, how you're listening to the reader, and what kind of flexibility you've been able to build into the product.
SHETTY: Absolutely. The iterative nature of web design development — or I should say, digital design development — is a terrific kind of approach for designing something that users really love and respond to. For us, it's tools like Chartbeat, which we love, and other kind of leading-edge ways of getting real moment-to-moment feedback from not only what people are reading, but how they're spending time with it, where they're coming from, what kind of engagement they have with it. It's all fed right back to the design and development process. It's a long way from the days of just building it and they will come. It's really paying such close attention to what people actually respond to.
Have you ever tried tweeting at a major news organization? How often have they responded or retweeted? Probably not often — and that corresponds to the findings offered by a GW/Pew study of 13 major news organizations, which found "limited use of the institution's public Twitter identity, one that generally takes less advantage of the interactive and reportorial nature of the Twitter." So when I went to The Miami Herald as part of a much larger project looking at newsrooms and news buildings, I was pleasantly surprised to find it, like some other newspapers, has actual people manning Twitter — breaking news "by hand," interacting with readers, and having a genuine public conversation over the main @miamiherald Twitter account, with its 98,000 followers. (Aside from Twitter, The Miami Herald is making ample use of its Facebook account, posting new stories once an hour and relying on feedback from the 46,000-plus audience for stories and tips — and as an extension of the Public Insight Network pioneered by American Public Media.) In Miami, Twitter takes on two distinct modes during the day — in the morning as a headline service and in the afternoon as conversation. "In the morning, we try to get the audience between 6 and 8 a.m. on Twitter and on the website," says continuous news editor/day editor Jeff Kleinman, who says he wakes up at 4:30 to begin monitoring the news. Kleinman uses Twitter to break news — whether or not it's on the paper's website. "We want to be first," he noted, as he quickly dashed off a tweet about a boat fire in front of me. More often than not, though, there will be a link to a short two-paragraph story begun on the website. But not always. Miami still remains a vibrant and competitive news marketplace with three local TV stations chasing breaking news, the Sun Sentinel, and even blogs getting in on niche action.
So in the breaking-news morning environment, "If something happens, I'll put it up on Twitter, I'll write or have the reporter write two quick grafs on the homepage with italics that say 'More to come,'" he said. "We're constantly updating over Twitter and on the website as news comes in." There's less time for conversation, but Kleinman is especially careful to do one thing: retweet what his reporters are offering from the field to the wider audience. "We're not there, but _they_ are, and Twitter is often the fastest way to say what's going on," he noted. So while the reporters have their own followings, their work gets amplified to a larger audience. Take this example of breaking news:
BREAKING: RT @waltermichot: Neighbors gather at scene of one shot and transported 5644 NW 4th Ave. twitter.com/WalterMichot/s… -- The Miami Herald (@MiamiHerald) May 2, 2013
Walter Michot, a former photographer who prowls the city with an iPhone (another story), has frequently broken news on his Twitter account, which has then been retweeted by @miamiherald. The mantra in the newsroom is to tweet, write, tweet, write, perhaps blog, and then write a takeout for the web and perhaps the paper. Later on in the afternoon, Twitter and Facebook take on a more conversational tone. Luisa Yanez runs the @miamiherald account then. She focuses on three key things: curating incoming reporters' work and retweeting it — adding additional substance if necessary; offering updates from the website; and responding to readers. The Miami Herald also offers updates about traffic and weather "as a public service and because people want to know," Kleinman said, so followers might see something like this.
#Weather alert: Severe thunderstorm warning issued for the #Keys until 1:30 p.m. -- The Miami Herald (@MiamiHerald) May 2, 2013
And then Yanez will retweet a reader who happens to chime in with a photo, in this case, Marven The Martian (@DaReelMJ), who offers a twitpic of the nasty weather brewing.
@miamiherald Even from the balcony it doesn't get any better as it has started to rain in Sunny Isles. #Weather twitter.com/DaReelMJ/statu… -- Marven The Martian (@DaReelMJ) May 2, 2013
The Herald also uses Twitter as a direct way to ask its readers to pitch in for story help:
The Herald is writing about ruling that would allow teens to obtain the "morning after" pill. Please contact aburch@MiamiHerald.com. -- The Miami Herald (@MiamiHerald) May 2, 2013
The main Twitter feed doesn't shy away from letting reporters show off their spunk. For instance, on Evan Benn's first story for the paper (yes, they hired someone):
MT @evanbenn: My first @miamiherald story. Can't beat 'em? Eat 'em. Smoked python at invasive-species meal hrld.us/11EXFcO -- The Miami Herald (@MiamiHerald) May 2, 2013
That, of course, is what they call in the newsroom an only-in-Miami story. And it prompted some only-in-Miami community conversation:
@miamiherald @evanbenn I would think it would taste like smoked eel but maybe more like gator? Either way, great idea hrld.us/11EXFcO -- Jackie Blue (@JackieBlue4u) May 2, 2013
@jackieblue4u Closer to gator, and smoking it really did make it taste like bacon, or prosciutto. -- Evan Benn (@EvanBenn) May 2, 2013
Kleinman and others acknowledge that the tweet-to-web traffic conversion isn't what they'd like it to be. But for them, Twitter is a way to build an audience, establish their continued brand prominence, and carry on a conversation. And while The Miami Herald newsroom might be losing the best view in journalism for a new home by the airport, location might not matter as much as it once did, because their conversation with their audience is virtual. Those who doubt that a newsroom that is struggling with staff and budget problems can handle putting the time and energy into social media should look at Miami and see a case of what's going well. And those who think that community conversation is too hard to handle should also pause and consider the possibilities that do exist when a newsroom engages with its community. Especially if it's about eating python. _Photo of outgoing Miami Herald building by Phillip Pessar used under a Creative Commons license._
What kind of response do we want readers to have? When you build an informative and elegant visualization, how are you hoping they'll react? These are questions that Amanda Cox of The New York Times' graphics desk asks herself on a regular basis. In a recent analysis of their popularity on social media, Cox tried to locate what makes a graphic popular.
1. “development.really.hard”
2. “big.breaking.news.big.breaking.news.adjacent”
3. “useful”
4. “explicitly.emotional…atmospheric”
5. “surprise.reveal”
6. “comprehensive”
Unsurprisingly, “difficult” topics — mostly related to war, violence, climate change, and other highly complex issues — performed least well, but “takeaway” pieces with an obvious message also performed poorly as a class. In contrast, visualizations that required extensive technical resources tended to perform particularly well, as did features Cox classed as emotional and useful — and, of course, those closely tied to breaking news. In the wrap-up of her analysis, Cox considered the problem of indicating importance to the paper’s readership across platforms: “How do you signal that something is important? You do that by using the resource that is scarce.” In print, the Times can use scarcity to indicate importance by giving an important graphic a desirable spot on a “good page.” On the web, the equivalent scarce resource isn’t placement, but the allocation of valuable internal tech/development hours.
Lately, the news around BuzzFeed is all about how serious they are. They're getting into longform. They're fixing breaking news. They're hiring big names from The New York Times. But that doesn't mean the good people of BuzzFeed have forgotten where they came from — what community editor (and animals editor!) Jack Shepherd calls the site's "bread and butter."
The department devoted to creating this "old school" content is known as BuzzTeam. Their focus is anything shareable — lists, animals, nostalgia. The kind of content that BuzzFeed's loyal readers have become hyper-familiar with. Many, in fact, have consumed so many such BuzzFeed posts that they've become adept at mimicking both their tone and their viral success. Earlier this month, BuzzFeed's editors took a step toward giving those faithful followers a little more of the spotlight they crave. Shepherd, along with a staff of four, now runs BuzzFeed Community, a content-producing vertical of its very own, complete with featured posts by community members and a leaderboard with the latest on whose posts are getting the most traffic, likes, comments, and badges. It's a competitive place, and anyone can join and enter the fray. Success generally comes to those who enjoy making post after post, learning the audience and predicting what they will like. Shepherd says active, productive, and successful community members are the kind of people who are Internet-obsessed. "They're aware of how long a particular image has been around, what its currency is, whether it will come back or whether it's old," he said, laughing. "They'll always let us know." This community has existed since the early days of BuzzFeed. The site's community moderator, Cates Holderness, was boarding and grooming dogs at a kennel in North Carolina when she first discovered the site. "After a long day at work, posting on BuzzFeed was both a creative outlet for me as well as a way to relax and have fun. I got great feedback from the other users, who were very supportive and encouraging."
#FF --> @buzzfeeders, where the best of the @buzzfeed Community will be featured! -- Cates Holderness (@catesish) May 8, 2013
@catesish @buzzfeeders @buzzfeed You mean like @tonydac??? :D buzzfeed.com/tonydac/9-ways… -- Leslie Stewart (@darlingstewie) May 8, 2013
@darlingstewie @tonydac Yep! Exactly like that! -- BuzzFeed Community (@BuzzFeeders) May 8, 2013
@buzzfeeders @tonydac Tony is just so smart and funny. He was the first friend I had on Twitter and he is just an all around NICE GUY -- Leslie Stewart (@darlingstewie) May 8, 2013
#ff @darlingstewie thanks for being awesome! You’re so supportive and helpful! Internet! -- tonydac (@tonydac) May 8, 2013
@tonydac INTERNETTTT -- Leslie Stewart (@darlingstewie) May 8, 2013
Holderness got so good at making posts — and scoring badges from staffers — that she ended up talking with Shepherd about a job, and was eventually hired to manage and interact with the site's growing commenter base. She says the Community staff spends a lot of time engaging with users, encouraging them to make more posts and awarding them various badges. But her greatest power is the ability to promote posts, placing them on the featured Community page, or suggesting them to the editors of other verticals. Holderness says the Community team reads every single post that's made. Now, partially in hopes of finding others like her, Shepherd has opened that community up. The CMS for community members was originally built to allow potential hires to make sample posts, he says. But what the new vertical offers, in addition to centralized Featured Posts and a way to view the Community contributions all at once, is a leaderboard. (At the moment, WhittyGolden is edging out swelldesigner for the top spot, powered by posts like "Ten American Idol Judges Whose Opinion You’d Actually Respect" and "Why It Would Have Totally Rocked To Be A Huxtable.") "Now they have a community board that also lets you see how you rank against other community members," says Community member Alexa Westerfield, "so that brings out my competitive spirit." That sense of competition is key, because along with the new vertical, BuzzFeed community members now also have a new currency for measuring how well they're doing: Cat Power. (Not that Cat Power.) "I know that on Reddit, people get really obsessed with how much karma they have. That totally works for them, but I don't think that's our model in any way," says Shepherd. "Cat Power, actually, it's kind of silly, but it actually correlates directly with an actual thing. The higher your Cat Power is, the more posts you can suggest to our community editors for consideration."
Community users can create as many posts as they want, but they can only promote a few. The better their posts are, the more Cat Power they get, the more access to editors they have, which in turn means more promotion, audience, and views. Building a relationship with the editors can be lucrative — Shepherd says there's an active freelancing network to pay users, and lately BuzzFeed fellowships have been given to frequent posters. Shepherd calls the new vertical "small scaffolding to help people naturally" ascend the steps of the hiring process — and he says the Community editing team will expand within the year.
Hundreds of applications for the @buzzfeeduk writer job. Best way to stand out from the pack: sign up + start posting buzzfeed.com/community -- Luke Lewis (@lukelewis) May 16, 2013
Those jobs are major incentives for BuzzFeed Community members. A cursory examination of frequent posters finds the vast majority seem to be journalism students, budding comedians, or both. Frequent users said that creating posts for BuzzFeed offers built-in access to an audience of a size it would be impossible to reach otherwise. "I had a blog for five years that literally nobody read," says Shepherd. Writes Travis Greenwood, a top-ranked Community member currently in marketing: "My community posts have been linked by National Geographic, Time, Huffington Post, Jezebel, Babble (a Disney property), ET.com, Complex, Fark, Mental Floss, Team Coco, IFC, Glamour, and dozens of other prominent sites (in other words, big, established editorial brands that probably wouldn't give me the time of day otherwise). On a basic level, BuzzFeed is a marketplace of ideas and I want to compete in this arena. Pushing something onto the homepage is like jumping onto the diamond with the Red Sox (I'm from Boston)." Of course, media companies hiring popular commenters as writers is nothing new. "We have a long tradition of hiring people who love the site and organically discover how to make really great BuzzFeed posts," says Shepherd. And as Matthew Ingram wrote a few weeks ago, it's happened at The Atlantic, Wired, and Gawker — and Gawker's new Kinja platform seems expressly built for the purpose of surfacing talent. Community members are also using BuzzFeed as an opportunity to learn about audience desires. They get access to a BuzzFeed dashboard that tracks tweets, links, and views, and, through editors, can have their success tracked by BuzzFeed's algorithm. "People love to see posts that reflect them or impact them. That's just a rule of journalism," wrote Austin Carroll.
"As a journalism student, I do hope to get hired by BuzzFeed (currently that is in the talks of happening soon). However, I am mostly a narrative television writer and therefore this experience has given me valuable insight into what people relate to, what they find funny, and what they want to talk about and share." So while they may be creating heavily branded and highly popular content for free, it's not as though BuzzFeed's Community users walk away empty-handed. But what is BuzzFeed hoping will come of the project? "I would love to see a large core of users genuinely competing with the editors. It's always the case with big sites, and it's true for us as well, that the percentage of registered users versus the percentage of people who are dedicated and active — the latter is very small," says Shepherd. "I would consider this to be a success if we got really good at focusing on that percentage and growing that and giving them the attention they wanted and making it a good experience. Getting it to a point where they could kind of compete with the editors." I asked Shepherd whether they're concerned about users repackaging staffer ideas, or vice versa. On Twitter, some voiced even more serious concerns.
Starting a pool: How long before BuzzFeed gets sued for something someone posts in its Community section? -- Nicholas Jackson (@nbj914) May 15, 2013
But Shepherd said, while Community wasn't built to help editors pad the other verticals, if users end up making content that's more popular than what staffers make, that's all right. "Maybe from there," he says, "there will be entirely new post types that we never thought of that really work because they're being crowdsourced or being collaborative." _Photo by Roger H. Goun used under a Creative Commons license._
Robinson Meyer is tweeting up an interesting storm about the "death" of "tech blogging" — tied to Gizmodo's announcement of some impressive new hires today. To get past the scare quotes, here's Rob.
With @gizmodo’s massive refresh, this feels obvious but worth saying:“Tech blogging”—as the genre was long defined—no longer exists. -- Robinson Meyer (@yayitsrob) May 17, 2013
TechCrunch established the “tech blogging” form, when—after going with ~any~ start-up news—it launched a whole blog about phones in ~2006. -- Robinson Meyer (@yayitsrob) May 17, 2013
50% start-ups, 50% phones. “Tech blogging” as genre. The NYT’s Bits—a tech blog! at the NYT!—opens shop in 2007: bits.blogs.nytimes.com/2011/11/10/bit… -- Robinson Meyer (@yayitsrob) May 17, 2013
Last Oct., @nicknoted complained about a “general dearth of news” in Gizmodo’s beat: jimromenesko.com/2012/10/15/gaw… Too few new phones, companies. -- Robinson Meyer (@yayitsrob) May 17, 2013
(@alexismadrigal called & covered the *reasons* behind tech journalism’s demise all the way back in April 2012: theatlantic.com/technology/arc… ) -- Robinson Meyer (@yayitsrob) May 17, 2013
Since spring of 2012, you could watch the tech beat break. What were business stories, media stories, policy stories were labeled “tech.” -- Robinson Meyer (@yayitsrob) May 17, 2013
Slow news day? This science story kinda sorta qualifies as “tech.” And, uh, hey! Here’s an architecture story that involves cell phones! -- Robinson Meyer (@yayitsrob) May 17, 2013
With @bldgblog & @paleofuture, @gizmodo enters (my fav) part of tech: gizmodo.com/new-faces-new-… Structure, history, ethics. Weird Techblogging. -- Robinson Meyer (@yayitsrob) May 17, 2013
Tech journalism, ca. 2013: • SAME AS IT EVER WAS: Ars.• COVER WHATEVER MIGHT BE TECH NEWS: The Verge, &c.• WEIRD STRUCTURES: @gizmodo! -- Robinson Meyer (@yayitsrob) May 17, 2013
The death of Web 2.0-flavored techblogging is worth celebrating because—maybe—it’ll end the rampant ahistoricism of technology journalism. -- Robinson Meyer (@yayitsrob) May 17, 2013
The best part of Web 2.0, of “social media,” of new(!) smartphones(!) was the rambunctious hope. But you can hope and know your own history. -- Robinson Meyer (@yayitsrob) May 17, 2013
Media has a history & policy has a history & higher education has a history, all of them riven with tech—yay, cover them! When appropriate! -- Robinson Meyer (@yayitsrob) May 17, 2013
You could think of what he's saying as the Revenge of the Liberal Arts Majors. While we're talking about multi-tweet runs, might as well link to Ben Mathis-Lilly's defense of new media terminology.
Today at New York University, a bunch of smart people are gathered at the Governing Algorithms conference.
Algorithms are increasingly invoked as powerful entities that control, govern, sort, regulate, and shape everything from financial trades to news media. Nevertheless, the nature and implications of such orderings are far from clear. What exactly is it that algorithms “do”? What is the role attributed to “algorithms” in these arguments? How can we turn the “problem of algorithms” into an object of productive inquiry? This conference sets out to explore the recent rise of algorithms as an object of interest in scholarship, policy, and practice.
If this interests you, I'd suggest following #govalgo on Twitter, checking out the proposed pre-conference reading list, and looking at the discussion papers submitted. One that stood out to me was Tarleton Gillespie's "The Relevance of Algorithms," which connects the idea that algorithms are "objective" to journalists' conception of the same idea (emphasis all mine):
This assertion of algorithmic objectivity plays in many ways an equivalent role to the norm of objectivity in Western journalism. Like search engines, journalists have developed tactics for determining what is most relevant, how to report it, and how to assure its relevance — a set of practices that are relatively invisible to their audience, a goal that they admit is messier to pursue than they might appear, and a principle that helps set aside but does not eradicate value judgments and personal politics. These institutionalized practices are animated by a conceptual promise that, in the discourse of journalism, is regularly articulated (or overstated) as a kind of totem. _Journalists use the norm of objectivity as a "strategic ritual" (Tuchman 1972), to lend public legitimacy to knowledge production tactics that are inherently precarious. "Establishing jurisdiction over the ability to objectively parse reality is a claim to a special kind of authority"_ (Schudson and Anderson 2009, 96). Journalist and algorithmic objectivities are by no means the same. Journalistic objectivity depends on an institutional promise of due diligence, built into and conveyed via a set of norms journalists learned in training and on the job; their choices represent a careful expertise backed by a deeply infused, philosophical and professional commitment to set aside their own biases and political beliefs. _The promise of the algorithm leans much less on institutional norms and trained expertise, and more on a technologically inflected promise of mechanical neutrality._ Whatever choices are made are presented both as distant from the intervention of human hands, and as submerged inside of the cold workings of the machine. But in both, legitimacy depends on accumulated guidelines for the proceduralization of information selection. The discourses and practices of objectivity have come to serve as a constitutive rule of journalism (Ryfe 2006).
_Objectivity is part of how journalists understand themselves and what it means to be a journalist. It is part of how their work is evaluated, by editors, colleagues, and their readers. It is a defining signal by which journalists even recognize what counts as journalism._ The promise of algorithmic objectivity, too, has been palpably incorporated into the working practices of algorithm providers, constitutively defining the function and purpose of the information service. When Google includes in its "Ten Things We Know to Be True" manifesto that "Our users trust our objectivity and no short-term gain could ever justify breaching that trust," this is neither spin nor corporate Kool-Aid. It is a deeply ingrained understanding of the public character of Google's information service, one that both influences and legitimizes many of its technical and commercial undertakings, and helps obscure the messier reality of the service it provides.
The Tuchman reference is to Gaye Tuchman's 1972 landmark piece "Objectivity as Strategic Ritual: An Examination of Newsmen's Notions of Objectivity." The Michael Schudson/C.W. Anderson piece is "Objectivity, Professionalism, and Truth Seeking in Journalism" (2009). The Ryfe is David Ryfe's "The Nature of News Rules."
Matt Waite — ex-Tampa Bay Times and Politifact, currently professing at the University of Nebraska — promotes the Raspberry Pi as a Trojan horse for newsroom IT. (Trojan horse in the sneaky-way-to-get-around-obstacles sense, not in the malware sense.)
Unfamiliar with the Pi? The Model B Pi is a $35 computer that’s about the size of a deck of cards. It’s got an ethernet port, and you supply the hard drive in the form of an SD card, the keyboard, mouse and monitor. Now, for $35, you’re not getting a ton of horsepower, but for simple repetitive tasks it works great. What kind of simple, repetitive tasks? Let’s pretend for a second that you wanted to set up a scraper that dumped data into a database every hour. Ideally, you’d have a server somewhere and you’d set up a task on it — I like using ‘nix’s cron for things like this — and off it would go, mindlessly gathering data for you and putting it into a database. You could then go about your life, stopping by from time to time to get that data and do whatever you’re going to do with it. So you ask newsroom IT for this and, of course, the answer is no. And no we won’t give you the money to run this in the cloud for a few bucks a month either. Enter the Pi.
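Waite's hourly scrape-and-store task is simple enough to sketch. The version below is a minimal illustration under stated assumptions — the JSON endpoint, field names, and file paths are all hypothetical, since the post doesn't specify any — with cron supplying the "every hour" part:

```python
# Sketch of the hourly scraper described above. The feed URL, the
# {"station", "value"} record shape, and the file paths are hypothetical.
# On the Pi, cron does the scheduling; a crontab entry might read:
#   0 * * * * /usr/bin/python3 /home/pi/scraper.py
import json
import sqlite3
import urllib.request

FEED_URL = "https://example.com/api/readings"  # hypothetical endpoint
DB_PATH = "/home/pi/readings.db"               # lives on the SD card

def fetch(url=FEED_URL):
    """Download the feed and parse it as a list of {"station", "value"} dicts."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        return json.load(resp)

def store(rows, db_path=DB_PATH):
    """Append rows to SQLite, creating the table on the first run."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings ("
        "fetched_at TEXT DEFAULT CURRENT_TIMESTAMP, station TEXT, value REAL)"
    )
    conn.executemany(
        "INSERT INTO readings (station, value) VALUES (:station, :value)", rows
    )
    conn.commit()
    conn.close()

def main():
    # Each cron run is one fetch-and-append cycle.
    store(fetch())
```

Nothing here needs more horsepower than the Model B offers; you stop by later and copy the SQLite file off the card whenever you want the accumulated data.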
This piece at Forbes about Boston journalist Justin Rice is interesting for a few reasons: — It describes how Justin, a few years back, started an independent, no-revenue site called BPSsports that covered high school sports in the urban Boston public schools — something local media wasn't particularly interested in covering. After building it up, it was scooped up by The Boston Globe, where it lives on as BPS Sports Blog at Boston.com, with Justin still serving as lead writer. It's a nice example of the value of just _starting_ something and of the opportunities that can open up.
Rice didn’t approach his innovative gig by knocking on doors in a traditional way. He nabbed it by doing the work of BPSsports himself, at first, creating a proof of concept that eventually paid off. He showcased the potential. He eventually reaped the rewards. In that, there’s a time-tested entrepreneurial tradition, but there’s also a takeaway specifically for writers looking to make their own beat, especially in an age of digital news. “Just jump in and don’t hesitate,” Rice said. “If you truly want to occupy a new niche, then you’ve got to claim that niche as quickly as possible, before anyone else does. Throw up what you can, do as much as you can with it. If it truly is a niche, somebody else is going to be interested."
— We know Justin around here — he wrote a four-part series for Nieman Lab on the impact of sports networks becoming their own media outlets way back in 2009.
— The story isn't actually straight Forbes journalism — it's part of its Forbes BrandVoice program that lets brands "post interesting and relevant content on Forbes.com while tapping into the social web through Forbes’ powerful, search optimized publishing platform." So this piece was presented by…John Hancock, which I guess wants you to think of it whenever you think of urban high school sports? (James O'Brien is listed as the author.) This is the happiest sort of sponsored content — the story has nothing to do with John Hancock and has seemingly no attachment to John Hancock's business interests. (You don't see that on all of John Hancock's sponsored Forbes content — like a story about the wonders of homeownership or one about why you should "check in with your financial advisor at least once a year" about your 401(k).) We hear about it when sponsored content goes wrong, and there are plenty of landmines that need navigating. But it's also worth noting when sponsored content checks in as somewhere between "inoffensive" and "nice."
OUTRAGE AT SEIZURE OF AP RECORDS: The journalism and media world was collectively seething this week in a way you don't often see, after the Associated Press revealed that the U.S. Department of Justice had secretly obtained more than two months of phone records from more than 20 of its journalists' work and home lines. The government hasn't publicly said what it's looking for, but the seizure is widely believed to be part of its investigation into the leaker behind the AP's story last year about a foiled Yemeni bomb plot. The two best explanations of the situation come from Poynter's Andrew Beaujon and Free Press' Josh Stearns. The DOJ has moved quickly to defend itself publicly (and to deflect some attention): It wrote a letter to the AP claiming it had the legal right to make the seizure, which drew an indignant response from the AP. Attorney General Eric Holder held a press conference in which he emphasized the seriousness of the leak being investigated -- and also told NPR he wasn't sure how many times his department had seized such records of journalists. Holder also testified before Congress, and the White House pushed to revive a media shield bill that would require the government to notify news organizations before their records were seized, allowing them to fight it in court (with some exceptions). New Yorker attorney Lynn Oberlander and Jeffrey Hermes of the Digital Media Law Project both reviewed the law behind the case, finding that while the DOJ might be able to argue for the legality of its actions, it probably violated its own (non-binding) policy for such seizures by not informing the AP beforehand or getting judicial review. This case, as Hermes argued, "called attention to the fact that the DOJ's media policy has significant problems with transparency, accountability, and scope." The Washington Post's Timothy B. 
Lee marveled at the fact that the DOJ's actions are probably completely legal, and warned of the implications for all cell phone and email users. Other writers provided some historical context: The Washington Post's Erik Wemple looked at a couple of past cases to illustrate the difference made when the government gives prior notice, and also examined the long-term effects of its 2001 seizure of an AP reporter's records. Techdirt's Mike Masnick, meanwhile, noted how the DOJ has abused its supposedly careful process for record seizure in the past. Journalists were virtually universally outraged, as The Huffington Post's Jack Mirkinson chronicled in a pair of posts. The DOJ's actions were condemned as a violation of the freedom of the press in pieces from journalists and observers including The New York Times public editor Margaret Sullivan, Poynter's Al Tompkins, Free Press' Josh Stearns, and Slate's Emily Bazelon. The Guardian's Glenn Greenwald, as is his wont, placed the seizure in the context of the Obama administration's ongoing attacks on civil liberties, and the Electronic Frontier Foundation sounded a warning to all of us about the privacy of the communication we entrust to third parties. Marcy Wheeler of Salon broke down the administration's rationale for the leak investigation, arguing that it was motivated by resentment at the AP for pre-empting a planned announcement, and Techdirt's Mike Masnick concurred that it was driven more by embarrassment than national security concern. Reuters' Jack Shafer explained why the government may not have been concerned so much about the AP story's content as the potential damage from its source. SNOOPING WITH BLOOMBERG TERMINALS: The DOJ's seizure wasn't the only snooping story in journalism this week, though journalists were the offenders rather than the victims in the other one. 
The financial news service Bloomberg came under scrutiny with reports that executives from Goldman Sachs and JP Morgan Chase had confronted Bloomberg over its reporters' use of the company's terminals to track usage by the banks' employees. Reporters' access to that information has since been cut off, but the FDIC, Federal Reserve, and Treasury Department are all examining the situation. Not only that, but the Financial Times reported (paywalled; here's The Verge summarizing it) that thousands of confidential terminal messages had inadvertently been available online for years, though they've now been taken down. A bit of background on Bloomberg's terminals: They're everywhere in the financial services industry, and they account for by far the largest share of Bloomberg's revenue. Their primary purpose is to track financial data and news, but they can also be used to send messages, and users' login and customer service data was available to Bloomberg reporters. Bloomberg News editor-in-chief Matthew Winkler downplayed what information reporters have access to through the terminals, but noted that they've used that data as feedback to tailor their reporting since the early days of the company. The Wall Street Journal's William Launder went into deeper detail on the historical intertwining between journalists and the terminals' financial data, and Quartz's Zachary Seward gave a fuller picture of exactly what Bloomberg reporters can see regarding the terminals' users. BuzzFeed's Peter Lauria reported that a Bloomberg anchor had been disciplined in 2011 for making on-air comments about using terminal data to track a source, and The New York Times' Amy Chozick reported that Bloomberg reporters did talk about using terminal data to help break news. Nitasha Tiku of Gawker gave some more details about how the terminals were used in reporting and the information Bloomberg reporters store and share about their sources. 
Some observers debated how big a deal this snooping was. Both Adam Penenberg of PandoDaily and The Daily Beast's Stuart Stevens compared it negatively to News Corp.'s phone-hacking scandal, but the Columbia Journalism Review's Ryan Chittum and The Guardian's Heidi Moore noted that reporters couldn't get that much information from the terminals, and, as Moore argued, were simply mining data for any minute advantage in the same way their clients were. The most insightful piece on the issue came from Quartz's Zachary Seward, who wrote that the ability to see customers' data was an open secret, a feature rather than a bug for a company built on a borderline obsessive culture of external secrecy and internal "transparency." "Data comes into the company—as much as possible, from wherever possible—but it doesn’t leave because, at Bloomberg, information is money," Seward wrote. Former Bloomberg reporter Arik Hesseldahl of All Things D also detailed how deeply ingrained this surveillance is at Bloomberg, and Reuters' Felix Salmon argued that Bloomberg's terminal system is essentially a social network, where, like Facebook, users trade their data for the value the network provides. The Washington Post's Neil Irwin wondered if Bloomberg's model is ripe for disruption, and The New York Times' David Carr tied together the DOJ and Bloomberg scandals, noting that spying is more of a two-way street than journalists like to acknowledge. THE STRUGGLE OVER ONLINE VIDEO: There were a few interesting developments on the online video front worth keeping an eye on this week: ABC announced that it would begin livestreaming its feed through its iPad and iPhone apps to users in the area of some of the local stations it owns. 
It's the first time a network has offered any type of live mobile streaming, but it's not as much of a step forward in accessibility as you might think: It's only available to cable and satellite subscribers, despite the fact that it's a free over-the-air signal. GigaOM's Janko Roettgers looked more closely at the technology behind live streaming and how it's cleared the hurdles that have held it back in the past. He noted the ways in which the service contrasts with Aereo, the startup that lets subscribers access streaming network TV on mobile devices, much to the consternation of media executives. Aereo, meanwhile, has simplified its price structure and is expanding from New York into Boston and Atlanta. CNNMoney's Julianne Pepitone said despite the moves by ABC and Aereo, live online TV is still a ways off from becoming a reality for most people. Elsewhere in online video, YouTube debuted its subscription channels last Friday, and as Peter Kafka of All Things D pointed out, the lineup is almost entirely devoid of both big-media players and YouTube-native stars. The Guardian's Dan Gillmor questioned whether people will want to pay for what's offered, and Janko Roettgers of paidContent argued that the key is finding content whose market neatly intersects with YouTube's. A NEW STRONGBOX FOR LEAKS: The New Yorker this week launched Strongbox, a method of securely submitting sensitive information to the magazine, designed by the late digital activist Aaron Swartz and former hacker and Wired editor Kevin Poulsen. It's a pretty complicated process, involving the anonymity network Tor, encryption, and multiple computers and thumb drives. Poulsen explained Swartz's role in creating the underlying code for the process, known as DeadDrop. 
The most useful analysis of Strongbox comes from Source, where several journalist/developers discussed its advantages and limitations, generally finding it to be a helpful tool that's nonetheless not a silver bullet for security, and one that may be too complex for many people to use. You can also see some early optimism about Strongbox's viability in posts by the Village Voice's Sydney Brownstone and Trevor Timm of the Freedom of the Press Foundation, with Timm calling it the most promising leak submission system since WikiLeaks. READING ROUNDUP: A few other stories to check out this week: — Protests against the possible sale of the Tribune Co.'s newspapers to the conservative billionaire Koch brothers continued this week in Los Angeles, with another one planned for Orlando. Tribune Co. CEO Peter Liguori, however, tried to reassure employees that a sale of the papers wasn't a foregone conclusion. Rolling Stone's Matt Taibbi said stopping the Kochs from buying the papers is something unions should do, while Poynter's Andrew Beaujon said their potential influence may be overblown. — Poynter's Rick Edmonds reported on surprising new research finding that 92 percent of the time spent consuming news is on legacy platforms — print, radio, TV — rather than computers or mobile devices. Mathew Ingram of paidContent contested the usefulness of the data in illustrating the current state of media consumption. — At MediaShift Idea Lab, Brian Moritz gave four lessons for journalism students from his experiences at Syracuse working with the cutting edge of digital technology, such as drones, 3D printers, and immersive virtual reality tools. — Finally, NYU professor Jay Rosen laid out a blueprint for a networked beat, focusing on how it might work at The Atlantic's business news site Quartz. He also talked to paidContent's Mathew Ingram about his ideas for how to rework the beat with the public at the center. _Photo of phone console at AP's Washington bureau by AP/Jon Elswick. 
Photo of Bloomberg terminal by Ryan Wang._
The New Yorker's Matt Buchanan takes a look at the design culture of Google and how the company went from a "dizzying and disparate array of products" to the more unified look you see with Google's new releases today.
On Wednesday, the New Yorker launched a Tor- and open-source-based file-sharing tool/tip line called Strongbox meant to allow sources to communicate information to the magazine without fear of it being traced back to them. Since then, Source has compiled a wealth of context and information, including a litany of responses from across Twitter. Today, Source contacted some experts for further exposition. Writes Jonathan Stray:
I am also concerned that the system may still be too hard to use. The Strongbox web service has a simple, clean interface — and bravo for that — but first the user has to get Tor running. In my experience, even savvy technologists vastly overestimate the number of people who can reliably complete tasks like “download and install this software.” If these users don’t also understand why such drastic measures are necessary, they will find ways to accomplish their goals with much simpler tools — like email. Strongbox cannot help users who are too frustrated to get it working properly.

And then, there is "the big question hanging over any secure dropbox," writes The New York Times' Jacob Harris: "Will you get any useful tips?"
Now _that's_ crowdfunding:
The Tribune Company, which houses the Los Angeles Times, The Baltimore Sun, and The Chicago Tribune (along with many other local newspapers), is up for sale! The Bad News: The only people who are bidding on it right now are infamous right-wing Billionaires, who are likely to pay something around a $660 Million price tag to control a big slice of trusted news media.

So let's pass the hat and raise $660 million ourselves!
Yes, we're trying to make a point here. And yes, some might say we're tilting at corporate windmills — but someone's got to do it. We need to get the conversation on media ownership started. And what if by some freak miracle we do begin to approach the ridiculous sum of $660 Million? (That would be weird, but weirder things have happened - trust us.) What if we really do change the game of Billionaires vs. The Rest Of Us? It can't hurt to try, can it? In Short: This Could Be a Game-Changer. (Really.)
The announcement is up at the Nieman Foundation site. It's a great batch of 24 journalists — half from the United States, half from the rest of the world — who'll be spending the next year here at Harvard. (Journalists: It's never too early to start thinking about applying. The first deadlines are a little over six months away.)
Newsweek launched a beta version of its newly redesigned website today. AdAge's Michael Sebastian has more details on the new site, which he calls a "dramatic re-imagining." Along with the new look, NewsBeast has a new plan for how to make money off the former print magazine.
Newsweek.com will be entirely free at the start, but executives plan to eventually introduce a metered pay wall, in which frequent users will be asked to subscribe. All of the content will remain free to Newsweek Global subscribers. No ads will appear on Newsweek.com during the beta stage of its life. When ads do become part of the mix, they will not look like the standard units promoted by the Interactive Advertising Bureau, according to Mr. Shetty. Newsweek.com will instead adopt a sponsorship model featuring one advertiser in each article. "They're going to be bold, beautiful, high-impact units," he said.
Last year we wrote about the launch of Skift, the travel media start-up created by Rafat Ali, the founder of paidContent. Almost at the anniversary mark, Skift has received new funding, and Ali says the site will be expanding and increasing its focus on data: "We are using this funding to double on our staff (from 5 so far to 10, in a month), build out the initial sales infrastructure, and continue building out our data services," he writes.
Julie Bosman reporting for the Times:
In a year that was monopolized by the “Fifty Shades” erotic novels and their various knockoffs, e-book sales in fiction rose 42 percent over the year before, to $1.8 billion. Growth in nonfiction e-book sales was smaller, a 22 percent increase, to $484.2 million. E-book sales in the children’s and young-adult categories increased 117 percent, to $469.2 million. The survey revealed that e-books now account for 20 percent of publishers’ revenues, up from 15 percent in 2011. Publishers’ net revenues in 2012 were $15 billion, up from $14 billion in 2011, while unit sales of trade books increased 8 percent, to 2.3 billion.