Five years ago, I wrote a post about How to Fix RSS (which was my first post to appear at the top of Techmeme). The technology and media landscape has dramatically changed since then, so I’ve updated the simple three-step program, with a particular focus on news organizations.
RSS is NOT dead… it just needs to be reborn:
- Take down the partial-text RSS feeds from your website — they are useless, and nobody uses them. (Refer the four people still using them to steps 2 and 3 below.)
- Post your best content to Twitter and Facebook — they are infinitely more user-friendly, mainstream, and social than RSS readers, making them infinitely more useful and valuable. And keep that old, reliable email newsletter… email will outlive us all.
- Create full-text RSS feeds for B2B syndication and partnerships (content sharing, new platforms like Flipboard, Ongo, Zite, etc.). Rather than hand everyone raw RSS feeds, distribute them through a platform that can provide:
- Tracking and metrics
- Control over distribution to partners
- Support for web syndication (e.g. automated hard-coded links back, the Google syndication-source meta tag)
- Support for all of your business models (without co-opting them)
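One concrete piece of that web-syndication support: Google News introduced the syndication-source and original-source meta tags in late 2010 so that republished pages can point back to the originating story. A distribution platform could stamp these onto partner pages automatically (the URL below is a placeholder):

```html
<!-- On the partner's republished copy: tell Google News where the
     story originated (placeholder URL). -->
<meta name="syndication-source" content="http://www.example-newspaper.com/news/original-story.html">

<!-- On the originating publisher's own page: claim the story as original. -->
<meta name="original-source" content="http://www.example-newspaper.com/news/original-story.html">
```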
Lastly, here’s a still relevant (dare I say prescient) excerpt from my original post:
But remember — PEOPLE ARE LAZY. They don’t have the time to put these packages together themselves. The real competition in New Media will be among content remixers. We used to call these editors — the only difference is that remixers will have a nearly infinite diversity of content at their disposal.
From the Publish2 Blog:
Many people have reached out to us recently and asked, “How’s Publish2 doing? You guys have been very quiet for the last few months.” That’s because we’ve had our heads down rolling out the full content distribution service that we announced last summer and launched in beta last fall. And… we’ve successfully launched our business model.
So it’s time for an update on the growth of our content distribution network and our new software-as-a-service licensing business.
The value of any network grows roughly with the square of the number of participants (Metcalfe’s law). So we’re excited to report that our network now includes over 200 news organizations that are actively distributing and acquiring content through Publish2.
We’ve found the key to network growth is members “inviting their friends,” just like on Facebook, which in our case means news organizations inviting their partners. When all of your partners, and news orgs that you want to partner with, are on the network, it’s easy to see the value in joining.
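As a rough sketch of the network-effect math above: under Metcalfe’s law, the number of possible pairwise distribution relationships among n members grows with the square of n, so each new member makes joining more valuable for everyone already there. A quick illustration (the membership counts are just examples):

```python
def possible_partnerships(n: int) -> int:
    """Distinct pairs among n network members: n choose 2."""
    return n * (n - 1) // 2

# Doubling membership roughly quadruples the possible distribution
# relationships -- the core of the network effect.
print(possible_partnerships(100))  # 4950
print(possible_partnerships(200))  # 19900
```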
Originally published on Nieman Journalism Lab
Clay Shirky predicts that in 2011 traditional news syndication will see widespread disruption. I couldn’t agree more. But I don’t think the disruption will happen the way Clay describes it.
Clay’s prediction assumes that news consumption will continue its shift from traditional media to the traditional desktop web, where the hyperlink rules and news consumers bounce from hyperlinked page to hyperlinked page and from site to site to site. I think that assumption is wrong. In 2011, we’ll see open acknowledgement of what has long been understood about the traditional desktop web as a platform for consuming news content — it sucks.
The desktop web has been a revolutionary platform in terms of access to information, the democratization of publishing, and the socialization of media. But as a medium for consuming news content, from a user interface and user experience perspective, it’s problematic at best and downright awful at worst. News consumption has begun a major shift from the traditional desktop web to apps for touch tablets for a simple reason — the user experience and user interface are so much better, as the recent RJI survey of iPad users reflects. Consumers are choosing tablet apps over the traditional desktop web based on the quality of the user experience and the overall content “package.”
News organizations are already shifting their strategies to take advantage of that consumer shift. But few have thought about the role of syndication in news apps. With the immersive, hands-on experience of a tablet news app, the value of syndication changes entirely. Apps that deliver nothing but one news organization’s content will not compare favorably with the content richness of the web, no matter how good the UI is. And apps that bounce users around from site to site with an in-app browser, mimicking the traditional desktop web model, will fail for precisely the reason why users chose the app in the first place.
But news apps that can deliver full content, curated from a wide range of sources, within a cohesive, optimized — even breakthrough — UI for news consumption, will win because users will have the best of both worlds. Syndication in news apps will not be about republishing news that everyone else has. It will be about combining curated news with original content in order to create consumer packages that are deeply engaging and in many cases worth paying for. With this shift, news organizations will stop ceding to aggregators the huge value creation of curating and packaging news. Instead, news organizations will start defining their editorial brands as curators as much as they define them as original content creators.
It’s important to note that this new paradigm for news consumption isn’t necessarily anti-web on the back end. It can work with an HTML5 site that creates the same immersive UX/UI as a platform-native app, and can be distributed with an app front end via app stores to support the news org’s business model. Web pages are also still necessary for links shared via social networks. But for a news consumer’s primary daily news consumption — for news orgs they have a direct relationship with — syndication that includes the full content in an immersive app experience will be an essential driver of success.
The other reality that Clay overlooks, on the other end of the news evolution, is that syndication for print newspapers still matters because the print product is generating the cash that’s funding the digital transformation. Reducing the cost of filling the news hole in print with disruptive syndication models will generate more cash for digital. In the near term, that will have a significant impact on how the business of syndication is reshaped.
In that context, here are four predictions for how traditional news syndication will be disrupted in 2011:
Social network for news distribution
Traditional syndication is based on a hub-and-spoke model, where a newswire middleman takes in content from many sources, combines it with original content, and redistributes it. This is an inefficient, obsolete model and will be replaced by a model that has proven wildly successful in the consumer world — the social network.
News organizations have already been forming direct distribution networks to route around the traditional newswire middleman. In 2011, these networks will evolve beyond ad hoc email distribution to become truly scalable in a way that only a Facebook-like platform can enable. News organizations will create a network of trusted sources, the equivalent of “friends,” but where the relationships are based on distribution and the affiliation of editorial brands. I call this the “Content Graph,” the analogue to Facebook’s “Social Graph.”
The business of syndication and news distribution will be reshaped by the power of network effects. Why is that important? Watch this Sean Parker talk.
Human editorial judgment redux
Contrary to Clay’s devaluing of the wire editor’s judgment in selecting content, the value of human curation is actually becoming more important in defining the value of news brands. Google’s algorithm has dominated news distribution on the web (ask any news site what percentage of traffic comes from Google), but it’s being overtaken by social curation — links shared through social networks (ask any news site what percentage of traffic comes from Facebook and Twitter).
The same will happen with news organizations, as editors curate their Content Graph and create better editorial products than any algorithm can ever hope to create.
What social networks have proven is that people value most the judgment of other people they trust. You can trust a friend. You can trust the editor of your favorite news publication. But it’s about people. Syndication based on human curation will prove far more valuable to consumers than syndication based on faceless algorithms.
Free content disrupts again, but differently
The news industry has been disrupted by the explosion of content on the web and having to compete with free sources. A new model for syndication turns this disruption on its head by enabling news organizations to publish free content from high quality web publishers in exchange for branding and links back (a model that Yahoo, for example, has used for years).
News organizations can also barter content with partners to trade the value of content they have already paid to produce for content that they need. Syndication based on a barter economy will be extremely disruptive to traditional newswires charging for content.
Free syndicated content will also help the print product generate more cash in the near-term. (Never underestimate the importance of cash flow in business transformation.)
News organizations take back control
News organizations will increasingly take back control over how their own content is syndicated. This begins with taking back their rights from newswire middlemen, so they can have full control over the business strategy for their content syndication, whether they choose to barter, sell, or keep some content out of the syndication market entirely.
News organizations will also take control over how their content is packaged by aggregators, starting by taking control of their RSS feeds. The first big realization will be that RSS is dead as a consumer technology but has growing value as a B2B syndication mechanism. News organizations will start to take down the consumer feeds from their websites as they realize 99.9 percent of their audience who wants their “feed” is following them on Facebook or Twitter.
Instead of working ad hoc with aggregators and other partners, with no control over their B2B RSS feeds, news organizations will look for ways to more efficiently manage the commercial syndication of their content through a common platform that gives them both control and network scale.
Yesterday, two stories from Aol’s DailyFinance appeared in the Sunday print edition of the Daily Telegram, a newspaper in southern Michigan. These stories appeared on a business page that would otherwise have been produced almost entirely with stories from the Associated Press. The Daily Telegram got permission to publish these Aol stories not through a big corporate content deal, but directly through a peer-to-peer relationship — The Daily Telegram simply subscribed to DailyFinance’s newswire in Publish2’s News Exchange.
Now I’m going to tell you why what you see on this page of the Daily Telegram could play a decisive role in the race between Aol, Demand Media, and Yahoo to win the prize of big brand advertising on the web, and why it is also pivotal to the future of news.
It’s about a big idea that I introduced at TechCrunch Disrupt: The Content Graph — an analogue to the Social Graph, where high quality content brands create a large scale distribution network that could rival search and social media as a distributor of content.
In the Social Graph, you’re defined by your friends. In the Content Graph, a content brand is defined by its distribution relationships with other content brands.
The Content Graph is about leveraging the brand equity and consumer trust that is the greatest asset of every traditional media company. It’s about building a content brand’s reputation through distribution.
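Structurally, the Content Graph is a directed graph: nodes are content brands, and an edge from distributor to source means the distributor runs the source’s content. A minimal sketch (the brand names and relationships below are invented for illustration):

```python
# Toy Content Graph: each distributor maps to the brands whose
# content it republishes. All names and relationships are illustrative.
content_graph = {
    "DailyTelegram": ["DailyFinance", "WalletPop"],
    "MetroHerald": ["DailyFinance"],
    "CityGazette": ["DailyFinance", "MetroHerald"],
}

def distributed_by(graph: dict, brand: str) -> list:
    """Inbound edges: the brands that carry `brand`'s content.
    In the Content Graph these relationships define a brand's
    reputation, the way friends define you in the Social Graph."""
    return sorted(d for d, sources in graph.items() if brand in sources)

print(distributed_by(content_graph, "DailyFinance"))
# -> ['CityGazette', 'DailyTelegram', 'MetroHerald']
```

The more established brands that distribute a newer brand, the higher its standing; a link-analysis measure such as PageRank over these edges would be a natural extension.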
The news industry’s business model broke after it lost control over the distribution of news, with news brands suffering one wave of disintermediation after another.
The Content Graph puts news brands back in the game, but not as a return to monolithic monopolies, rather through the power of networks — a network of content brands. (This network includes independent journalists who cultivate their own personal brands.)
Ultimately, the Content Graph could be a map for brand advertising on the web, that enables advertisers to tap into a network of high quality content brands, at scale.
Sound interesting? Let’s dig deeper.
To understand the potential of the Content Graph, let’s look first at the race to become the most efficient, largest scale producer of high quality content in the world.
Demand Media has Aol and Yahoo in its sights to win this race. So declared Joanne Bradford, the new chief revenue officer for Demand Media, and former head of U.S. ad sales for Yahoo, who knows as well as anyone why the race is on — Demand Media, Aol, and Yahoo are positioning themselves for the huge tide of big brand advertising that is expected to flood the web and digital media in the next 10 years. Bradford talks about advertisers like HP and General Mills appearing in “contextually relevant” pages of Demand Media content.
The first race for ad dollars on the web was won by search and its dominance of contextual relevance. But for the next wave of ad dollars, contextual relevance won’t be enough, because these are BRAND ad dollars, and big brands will seek out the most trusted, highest quality content brands.
That’s why Bradford states the goal of Demand Media like this: “We want to be the biggest, best destination for brands.”
It’s about brands, and it’s about quality, as you’ll see in the first paragraph of Yahoo’s announcement of the Associated Content acquisition: “This strategic move extends Yahoo’s ability to provide high quality, personally relevant content for the benefit of more than 600 million users as well as tens of thousands of advertisers.”
And Aol’s goal, according to CEO Tim Armstrong: “AOL is planning on being the largest high quality content producer for digital media.”
Bradford boasts, “I believe our quality stands on its own.”
See the recurring theme? But with all the posturing, here’s the big question: Who will be the arbiter of content quality? How will big brand advertisers decide which content brands they can trust with their brand?
To get big brand advertisers, you’ve got to have reach, but to really get brand advertising at scale, you’ve got to have the highest quality content brands — because it matters to advertisers:
“I think these guys are not building trusted media brands,” Brian Monahan, senior VP at Universal McCann, said of the general landscape of low-cost content. He sees these entities as closer to niche enthusiast magazines rather than newspapers. “They’re never going to take the place of a Condé Nast or some other trusted premium editorial voice,” he said.
Just ask all of the traditional media content brands that still get a disproportionate share of big brand advertising, and whose brand power is a big reason why those dollars haven’t followed consumers onto the web… yet.
It’s no surprise, then, that Demand Media and Associated Content have eagerly pursued distribution deals with the trusted content brands that can burnish their brands in the eyes of consumers and in the eyes of advertisers. Brands like the San Francisco Chronicle, Houston Chronicle, Atlanta Journal-Constitution, and USA Today. These are the content brands that already have the trust of advertiser brands, and it’s this trust that the new breed of demand-driven content producers rightly covets.
Demand Media, Associated Content, and Aol’s Seed are aiming to revolutionize the efficiency of content production. But a huge leap forward in efficiency won’t win the race unless they can also build brands that can attract brand advertising. Search may drive enormous traffic, but it doesn’t build content brands (ask any branded content site how well search visitors convert to loyal readers). Social media distribution (i.e. Twitter, Facebook) replaces the value of content brands with the value of personal relationships.
That’s where the Content Graph comes in.
Aol just took a huge leap forward by efficiently leveraging the Daily Telegram’s brand to build the DailyFinance brand, taking the place of the Associated Press, one of the most widely known and trusted content brands. And, critically, this happened without the kind of corporate content distribution deal that entirely lacks the efficiency Aol, Demand Media, and Yahoo are pioneering in content production.
For every revolution in content production, there is a corresponding revolution in content distribution. The Content Graph is the revolution in efficient content distribution.
The Daily Telegram, a node in the Content Graph, forged a direct connection with DailyFinance. Aol got free branding. And the Daily Telegram not only got free content but lifted its own brand by running better, more interesting content from DailyFinance than it could get from the AP.
Now, imagine DailyFinance content appearing in hundreds of newspapers around the world, rivaling the Associated Press as a primary source of business news — imagine the huge brand equity that Aol could build by being distributed by all of those trusted content brands.
Now imagine every high quality content brand, connected in the largest brand network in the world. Imagine the Content Graph at scale.
The Content Graph defines content quality for newer brands by mapping how their content is distributed by established brands. And it further defines the quality of established brands by mapping how they distribute newer content sources.
Think about how valuable that network would be to media companies for efficiently building and positioning their brands, to ensure that their content is distributed to the broadest possible audience in a way that delivers maximum brand value.
I highlighted the example of the Content Graph in print because there’s a huge near-term opportunity to build the Content Graph through a massive improvement in the efficiency of content distribution to print, and to leverage the value of those trusted brands while print still reaches tens of millions of news consumers.
But the Content Graph is entirely multiplatform, and ultimately a map of a global digital distribution network.
Full content distribution on the web would of course still come with links to the source — the goal is to build the Content Graph in parallel with the link economy. Google could eventually use the Content Graph as a guide, e.g. to identify canonical sources.
The Content Graph also extends to mobile platforms, where publisher apps can be greatly enhanced by distributing content from a wide range of sources. It’s about content brands as curators of the best content, anywhere, not just the content they can produce.
And what is the mutual business benefit for content brands in developing the Content Graph?
Think about the value of the Content Graph to a big brand advertiser looking not only to maximize their reach on the web but, as important if not more so, to maximize their brand value. Imagine how an advertiser evaluating DailyFinance could use the Content Graph to measure the brand’s distribution across hundreds of trusted newspaper brands… and what if the ad dollars followed DailyFinance throughout the Content Graph?
And that gets to why this is a pivotal moment in the future of news.
A newspaper like The Daily Telegram can get ahead of the curve to benefit from the disruptive power of the web and the efficiency of content created for the web. If they had run this Walmart college story from WalletPop in place of the AP story that appeared on the page, they could have gotten the content for that page entirely for free (along with the Dave Ramsey column), all by leveraging the value of their brand. And it’s by unlocking the value of its brand that newspapers like The Daily Telegram can survive and ultimately thrive in the digital age.
The Daily Telegram can become part of the largest network of high quality brands — the Content Graph — a network formed by the news brands themselves (not a middleman like the Associated Press), which can capture big brand advertising dollars as they migrate to the web.
And that is the future of news… a network of trusted news brands. And the corresponding future of brand advertising is in harnessing the power of this brand network.
This week, at TechCrunch Disrupt, we’re announcing the launch of Publish2 News Exchange, a platform aimed at disrupting the Associated Press monopoly over content distribution to newspapers. With Publish2 News Exchange, newspapers can replace the AP’s obsolete cooperative with direct content sharing and replace the AP’s commodity content with both free, high-quality content from the Web and content from any paid source.
With Publish2 News Exchange, we’ve created what the AP should have become, but can’t because of a classic Innovator’s Dilemma. The New AP is an open, efficient, scalable news distribution platform. We’re enabling newspapers to benefit for the first time from the disruptive power of the Web, and from the efficiency of content production on the Web.
Newspaper online advertising has not benefited greatly from the recent upswing in online ad spending, according to the New York Times and most of the recent newspaper company quarterly results. This is no surprise because most newspaper websites sell SPACE for commodity advertising — display ads and classifieds — and thus are hard pressed to compete with ad networks that specialize in selling commodity ad space by the megaton (or giving it away for free, in the case of Craigslist).
Back when newspapers were the only game in town for ad space, they could charge whatever they wanted. Now the web has near-infinite ad space, and newspapers find themselves playing the wrong game. They’ve got ad sales staffs that specialize in commodity order fulfillment, not premium advertising solutions.
So what distinguishes a premium ad solution from commodity ad space? It’s a premium solution if not every site can deliver the value. Any site can slap a display ad on a page — that’s what makes it a commodity. High-end brand publishers like newspapers really have only one way to distinguish themselves from every other web publisher on the planet — their ability to create high quality content that attracts a targeted, high quality audience.
But… there are many sites that specialize in creating “good enough” content that can attract segments of that high quality audience, and then selling that audience at a much lower cost.
But wait, you say, high-end brand publishers should be able to sell the ad next to their higher quality content at a higher price. Isn’t that the whole principle behind premium publishing?
Not when it comes to display advertising. Display advertising isn’t more valuable when placed next to premium content because display advertising has so LITTLE value to begin with. In fact, display advertising creates so little consumer value that it actually SUBTRACTS value from high quality editorial content when placed next to it. Ever see those belly fat ads on top tier news sites? Dancing Martians lowering your mortgage payments? Whiten your teeth? It’s a total train wreck.
In fact, many ad exchanges are focused on bundling and selling audiences in a way that exploits this commoditization of display ads and effectively cuts out the value of the publisher.
So what’s a high-end brand publisher to do?
The answer is to offer advertising solutions that give advertisers the opportunity to create REAL consumer value; the kind of value that complements and even enhances the value of high quality editorial content; the kind of value that high-end brand publishers specialize in creating.
Many advertisers have sought this kind of premium value from high-end brand publishers, and most publishers have responded with customized solutions like the classic “microsite” or one-off customized ads. But that too can be a losing proposition. Case in point from Mercedes:
It was a good day for newspaper Web sites when Mercedes-Benz USA introduced its updated E-Class cars this summer. Mercedes bought out the ad space on the home pages of The Washington Post, The Wall Street Journal and The New York Times, and had those sites create special 3-D ads for them, at an estimated cost of $100,000 a site.
When Mercedes advertises its more basic models next year, it will largely avoid newspaper Web sites and rely on networks. That lets Mercedes “be very targeted and efficient with our dollars,” said Beth Lange, digital media specialist for Mercedes-Benz USA.
The problem with these solutions is they don’t scale — they are expensive for publishers to deliver, and they are expensive for advertisers to buy. The result is most advertisers are lured back by the siren song of commodity ad network cost efficiency. So while high-end brand publishers do well for big splashy launches, they can’t compete when advertisers go into the post-launch mode of consistent, continuous, high ROI value creation.
What high-end publishers need is a way for advertisers to create premium value for consumers that scales and can deliver a consistent, continuous ROI that justifies a premium over commodity ad networks.
What would advertisers be willing to pay a consistent premium for? The holy grail of every advertiser — to become media, i.e. to create high quality content that attracts and retains an audience of current and prospective customers. Advertisers would also pay a premium to align the value that they create for the consumer with the value that high-end brand publishers create for consumers — just like on a search results page, where the ads are as valuable as the “editorial” content.
But if every high-end brand publisher tries to deliver such a solution by themselves, it won’t scale for advertisers. The key is to scale across many high-end brand sites while still delivering the kind of premium value that commands premium pricing.
That’s the next generation of premium online advertising. More in my next post.
In response to the launch of Google’s Fast Flip, I observed that Google is correctly focused on creating a new user interface for news, when most media companies are not. A lot of people responded that Fast Flip is not an innovative or effective UI for news — which may be true, but that misses the point entirely.
It doesn’t matter so much whether Google succeeds or fails with this particular experiment. What matters is that they are trying to solve the right problem.
The challenge for media companies is not to figure out what to do with their content — content in and of itself doesn’t matter. It never has.
It’s all about the package.
Newspaper articles don’t matter without a newspaper. Magazine articles don’t matter without a magazine. TV shows don’t matter without a broadcast or cable channel.
Newspapers’ inability to generate the same revenue online as in print has nothing to do with content. It’s because on the web they are no longer in the business of packaging content, and that’s what the newspaper business, like every other media business, has always been about. Instead, media companies put their content on the web and let search and other aggregators package it.
An individual content item on the web, without a package, has marginal value approaching zero — and attempting to charge for an individual item of content is unlikely to change that. What you CAN charge for is the package.
Media companies need to be doing what Google is doing — experimenting with new ways to package content, which in a digital media world means new UIs and new ways to aggregate.
The nature of innovation is that many experiments will fail along the way. The key is to be aimed at solving the right problem.
Focus on the package. Whoever controls the package wins.
Ask newspapers. Or Google.
Oh, and while we’re on the subject of Fast Flip, lots of people overlooked one of the key words in the product name — FAST. Why does fast matter? How long does it take to get a result when you search on Google? Not long at all. In fact it’s darn FAST. (You can even see how long your Google search took in the blue bar across the top of the search results page.)
That’s why it matters — to the tune of $20 billion. Here’s Marissa Mayer on the importance of being fast. Google has the most successful UI and content package in the history of the web, that created one of the most lucrative business models in the history of media, so don’t write them off too quickly.
Google knows a lot about the future of news — more than many publishers. It’s evident in Google’s new product, Fast Flip, which allows news consumers to “flip” through news stories. What’s striking about Fast Flip is that Google is innovating precisely where publishers used to lead innovation.
Fast Flip is a new package for news.
The publishing business has always been about packaging content. Newspapers. Magazines. Newsletters.
In digital media, on the web, the news package is now a function of software — which is why Google is innovating precisely where publishers are not.
Fast Flip is, more accurately, an attempt to create a new UI for news — a better way to consume publishers’ content than publishers provide on their own sites.
Most publishers are focused on how to charge for news. But there’s very little talk about how to innovate the packaging of news, much less a new UI for news. There’s very little talk about how people consume news on the web, about the value of aggregating articles from multiple sources, about solving consumers’ problems rather than publishers’ problems.
That’s why Google is taking the lead on figuring out how to create the new news package, and why it will continue to control the lucrative front end of distribution, while publishers are left with the far less profitable back end of content creation.
Google is sharing revenue with publishers because Fast Flip goes well beyond linking, partially reproducing entire web pages. And publishers will have to be content with the revenue that Google shares.
Unless they finally decide to compete on the real playing field that will determine the future of news and publishing.
It was a busy Monday morning in two corners of the hacker journalist community: EveryBlock is acquired by MSNBC, and Y Combinator announces a “request for startups” to address that whole “future of journalism” question hanging out there in the open air.
Want to catch up?
Msnbc.com acquires local news Web site
MSNBC.com | August 17, 2009
Ryan Sholin says: MSNBC acquires Everyblock. This brief includes a reminder that they bought Newsvine some time ago. Not a bad stable of news sites to have around.
Tags: Media & Journalism, EveryBlock, msnbc, Adrian Holovaty, hyperlocal
MSNBC.com acquires EveryBlock
blog.everyblock.com | August 17, 2009
Ryan Sholin says: From Adrian’s post at the EveryBlock blog: “MSNBC.com has hired our whole team, and they’ve made it clear to us that we’ll be driving the site’s strategy and implementation, and that our site will remain an independent destination as a community service.”
Tags: Media & Journalism, Adrian Holovaty, msnbc, EveryBlock
knightfdn: Wondering if EveryBlock’s code remains open-source? Yep. Download it at the links posted here: http://kflinks.com/everyblock
Ryan Sholin says: The source code, as it was when the Knight News Challenge grant expired, will remain available. I wouldn’t expect to see an open-source fork maintained by the crew now employed by MSNBC, though.
Tags: Media & Journalism, EveryBlock, knight news challenge, msnbc
Msnbc.com acquires EveryBlock, what it means for local media
Lost Remote | August 17, 2009
Ryan Sholin says: Cory Bergman of LostRemote and MSNBC on the EveryBlock acquisition: “One of our first conversations will be how we can share EveryBlock data with local media partners. Our plan is not to compete with the local news ecosystem, but identify ways to reinforce it. After all, data complements coverage.”
Tags: Media & Journalism, EveryBlock, msnbc
Msnbc.com Acquires EveryBlock… Welcome Brother!
Mike Industries | August 17, 2009
Ryan Sholin says: Here’s Mike Davidson of Newsvine — acquired by MSNBC a ways back — on the EveryBlock news: “The organizations that succeed in local news will be the ones who respect all of the great journalism and increasingly available data in cities and neighborhoods across the world while creating better ways for people to consume it.”
Tags: Media & Journalism, EveryBlock, msnbc, Newsvine, Mike Davidson, Technology
YCRFS 1: The Future of Journalism
ycombinator.com | August 17, 2009
Ryan Sholin says: The first YCombinator “request for startups” asks: “What would a content site look like if you started from how to make money—as print media once did—instead of taking a particular form of journalism as a given and treating how to make money from it as an afterthought?”
Tags: Media & Journalism, ycombinator, journalism, newspapers, Technology
Y Combinator Starts Seeding Ideas To Startups
Ryan Sholin says: MG Siegler pens the TechCrunch post on Y Combinator’s new “Requests for Startups” including the first one, on the future of journalism: “This RFS is just the first of 3 to 5 that Y Combinator hopes to get out there before the October 26 Winter 2010 class application deadline, Graham tells us. Startups applying specifically for these RFS ideas will be able to indicate that on their applications.”
Tags: Media & Journalism, Technology, startups, funding, Business, ycombinator
Y Combinator’s “request for startups” in journalism
Wordyard | August 16, 2009
Ryan Sholin says: Scott Rosenberg on the “future of journalism” request for ideas from Y Combinator: “Graham’s challenge is elegantly simple: Instead of starting with the journalism and then puzzling out how to support it, start with the plan for revenue, then figure out what journalism might complement it. Recognize that the realm where innovation is most needed is the business side and how it relates to the journalism.”
Tags: Media & Journalism, ycombinator, startups, journalism, Technology, business model, Business, Scott Rosenberg
Tr.im to Go Open Source, Community Owned
ReadWriteWeb | August 17, 2009
Ryan Sholin says: Now tr.im is going to open-source their code, open their data, and give away their domain to a nonprofit to be named later? Sounds great. Let’s see what happens next.
Tags: Technology, tr.im, URL shorteners
VIDEO: The Secret Behind The Real-Time Web
rosstmiller on YouTube | August 13, 2009
Ryan Sholin says: In this video, FriendFeed (comically) reveals the secret little orderly process that keeps updates flowing through their network in real-time. A little industrial for my tastes, and proponents of the DRY principle in programming might throw up in their mouths a little bit. (Spotted via ReadWriteWeb.)
Tags: Technology, FriendFeed, video, legos
mathewi: Real-time reaction to FB/ @Friendfeed deal at http://friendfeed.com/bret [and at @scobleizer's page: http://bit.ly/B1c6C] via @digiphile
Twitter | August 10, 2009
Scott Karp says: Meta FriendFeed acquisition.
mediatwit: Quick thought: What if Facebook is just buying FriendFeed to kill a potential competitor? Wonder if they’ll integrate it, kill FF site.
Twitter | August 10, 2009
Scott Karp says: Good question.
dangillmor: Facebook buys FriendFeed, combining two of the most popular social networking sites i rarely use
Twitter | August 10, 2009
kleinmatic: tr.im’s collapse will have a more obvious and lasting effect than Facebook/Friendfeed.
Twitter | August 10, 2009
The Briefing: Who’s going to save your URL shortener from extinction?
Publishing 2.0 | August 10, 2009
Everything you need to know about the death of tr.im and the issue with URL shorteners but were afraid to ask. First draft of new Publishing 2.0 blog feature (this post is another first draft).
Bloglines On Life Support. This Story Needs An Ending
TechCrunch | August 10, 2009
Scott Karp says: Is RSS dead (re: Bloglines)? I don’t think it is, but who can resist “dead” memes?
Recession: Why Ad Industry Won’t Recover in Second Half
AdAge | August 10, 2009
Scott Karp says: Online and PR are “pockets” of strength in an otherwise bleak advertising forecast
USAA Bank Will Let Customers Deposit Checks by iPhone
New York Times | August 10, 2009
Scott Karp says: iPhone helping to kill another scourge of the paper-based world — physical check deposits.
ianbetteridge: The last company to try and control 3rd party software as Apple does on the iPhone was IBM with its mainframes. And we know how that ended.
Twitter | August 10, 2009
Scott Karp says: But the iPhone is just a wee bit cooler than the IBM mainframe. And it’s consumer hardware.
carr2n: wake up call. @BradStone writes that you probably didn’t have your coffee before you checked this tweet: http://bit.ly/e2qGt
Twitter | August 10, 2009
Scott Karp says: For more and more people, the web has replaced newspapers as the first media they consume in the morning.
Yesterday, URL shortener tr.im announced that they’re shutting down.
Why? What do you need to know about it? What’s going to happen as bit.ly swoops in to the (attempted) rescue? Are we too dependent on services like tr.im to tie the social Web together?
Ten links to answer your questions:
tr.im | August 9, 2009
Ryan Sholin says: From tr.im’s official blog post on their demise: “And, the data that tr.im generates — the hottest links that people are sharing right now — is all well and good, but everyone has this data. tr.im gets hit by countless bots every day farming this data to create and operate websites such as tweetmeme.com. So, *everyone* has this data, meaning it is basically worthless *by itself* to base a business on (as bit.ly and others are attempting to do) at least in our humble opinions.”
Tr.im URL Shortener Shuts Down; Short Links to Die?
Ryan Sholin says: tr.im dies, says there’s no way to monetize URL shortening. Well, of course not, if that’s all you do. The first-wave URL shorteners will be replaced by shorteners that are just secondary features of other apps. See also: Diggbar, Su.pr, HootSuite, etc.
zseward: What’s that expression? You never want to outlive your URL shortener http://tr.im
Ryan Sholin says: Zach Seward posted one of the first tweets I’ve been able to find noting tr.im’s untimely demise.
URL shortener Tr.im’s demise: Social web is built on house of cards
VentureBeat | August 10, 2009
Ryan Sholin says: Matt Marshall weighs in: “In other words, the rules of the social web are still being made up on the fly, and if you run a Web business, or are dependent on the Web for traffic, you should beware of the risk in relying on things like URL shorteners. One trick: Build your own URL shortening service.”
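Marshall’s “build your own” trick is simpler than it might sound. Here’s a minimal sketch of the core of a URL shortener (the class name and domain are hypothetical, and a real service would persist to a database rather than an in-memory list): the short slug is just a base-62 encoding of a numeric record id.

```python
import string

ALPHABET = string.digits + string.ascii_letters  # base-62: 0-9, a-z, A-Z

def encode(n):
    """Encode a numeric record id as a short base-62 slug."""
    if n == 0:
        return ALPHABET[0]
    chars = []
    while n:
        n, rem = divmod(n, 62)
        chars.append(ALPHABET[rem])
    return "".join(reversed(chars))

class Shortener:
    """In-memory sketch; a real service would use durable storage."""

    def __init__(self, domain="exmpl.co"):  # placeholder domain
        self.domain = domain
        self.urls = []  # position in this list doubles as the record id

    def shorten(self, long_url):
        """Store the URL and return its short form."""
        self.urls.append(long_url)
        return "http://%s/%s" % (self.domain, encode(len(self.urls) - 1))

    def expand(self, slug):
        """Decode a slug back to its record id and look up the URL."""
        n = 0
        for ch in slug:
            n = n * 62 + ALPHABET.index(ch)
        return self.urls[n]
```

Because the publisher controls the domain, short links made this way live exactly as long as the site does, rather than as long as some third party’s business model.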
Twitter’s platform shortcomings
Scobleizer | August 10, 2009
Ryan Sholin says: Robert Scoble enumerates Twitter’s shortcomings on the occasion of tr.im’s collapse: “5. Twitter has built a system that relies on a third party for functionality. Even now, if we use bit.ly links like Twitter recommends, there’s no guarantee that Twitter will keep those links working in the future if Bit.ly’s investors decide it can’t make money. Since money has NOT started flowing through the Twitter system yet we’re all wondering just how Bit.ly will make money…”
Bit.ly Wants To Make Money With A News Service; But Will Anyone Pay For It?
paidContent.org | July 31, 2009
Ryan Sholin says: Tameka Kee at PaidContent wrote this about bit.ly just a few days ago: “We’ve suggested a premium subscription service, where media companies and other heavy users would pay for advanced analytics, since bit.ly currently lets people track the number of clicks their links get and where their traffic’s coming from for free. In an interview with Wired, bit.ly General Manager Andrew Cohen acknowledged that the startup was thinking about charging for more robust data access, but also about creating a real-time news service that tracked breaking and popular stories.”
An Oversized Ruckus About Tiny Web Addresses: Bit.ly’s Bigfoot Offer to the Rest of the Business
All Things Digital | August 10, 2009
Ryan Sholin says: Peter Kafka on bit.ly’s proposed solution to play Internet Archive for short URLs: “To me, that sounds a bit like a mafia don shaking his head a tad wistfully after hearing that one of his old rivals got bumped off, then sending a big bouquet to the funeral. And I think that the tr.im team, as well as some of bit.ly’s other competitors, may take it in the same vein.”
zeldman.com | August 10, 2009
Ryan Sholin says: Zeldman on URL shorteners: “Rolling your own mini-URLs lessens the chance that your carefully cultivated links will rot if the third-party URL shortening site goes down or goes out of business, as is happening to tr.im, a URL shortener that is pulling the plug because it could neither monetize nor sell its service.” (Note the link to an excellent WordPress plugin for short URLs deep in this post.)
VIDEO: tr.im – the best URL shortener!?
YouTube | August 10, 2009
Ryan Sholin says: A three-month-old screencast review of tr.im’s features, which may serve as a useful archive of what the service offered as its users look for a substitute.
[Note: The links in this post were curated with Publish2.]
Journalists are news companies’ most valuable assets.
That’s what Mike Arrington asserts, and I think he’s right (disregard the “failing old media” rhetoric):
And earlier today I got a glimpse at what AOL is up to – they are hiring all the journalists being fired and laid off by the newspapers and magazines. And they now have a news room 1,500 journalists and editors strong. Amazingly, failing old media is throwing away their most valuable assets. And AOL is eagerly picking those assets up for a song. Before anyone knows it, AOL may be the most powerful news outlet in the world.
Given that NYT has gone to great lengths to avoid newsroom layoffs, I suspect they know full well how valuable their journalists are.
Mike Arrington is TechCrunch’s most valuable asset, for his personal brand and for the quality of the posts he writes.
As Arrington points out, AOL CEO Tim Armstrong has also realized how valuable journalists are, and is aligning AOL’s new strategy with cornering the market for journalist talent.
But is Arrington right that media companies are blithely throwing away their most valuable asset? Why did newspapers make so many newsroom cuts on their path back to profitability? Is it because they don’t recognize the value of their journalists?
I think it’s because they are still wrestling with the declining value of their other major asset: industrial printing and distribution capacity, i.e. printing presses and delivery trucks and all their industrial staff. While some newspapers have made significant cuts to their industrial operation by not delivering or publishing every day (and a few have taken the extreme step of ending their industrial operation entirely), most have protected this asset because it is not really variable — it’s mostly all or nothing.
But to say that the value of industrial printing and distribution capacity is declining is not to say it has no value — it of course still generates most of newspaper company revenues. But the decline, while exacerbated to a large degree by the recession, is still secular long-term. (And newspaper companies are surely using the breathing room they achieved through cost reduction-driven profitability to figure out their long-term strategies — and they are focused on digital.)
AOL, in contrast, has no industrial assets, so it has the latitude to invest in journalists. It also has another huge asset, one that newspapers enjoyed in their geographic distribution areas but entirely lack on the web: SCALE
A notable illustration of the shifting value of news company assets that sits between AOL and most newspaper companies is Politico.
Politico rose to prominence by showcasing its high profile journalists on its website.
Unlike most news sites, Politico has real profile pages for its journalists and showcases their bylines on every story (even the lead homepage story):
This doesn’t mean, however, that Politico derives no value from industrial printing and distribution. In fact, half of their $15 million in annual revenue comes from a print edition published three days a week when Congress is in session, and once a week otherwise (via Vanity Fair).
But Politico doesn’t own any printing presses or delivery trucks, i.e. no industrial assets. And the print publication is largely the product of content produced first for the web — and it is very much a “nichepaper,” i.e. it targets the highly valuable audience of Capitol Hill staffers and members of Congress.
The result is that Politico is able to invest in a talented newsroom staff of 100, paying nearly as much as The Washington Post. And Politico is profitable.
But does focusing on journalists as news companies’ most valuable asset mean that news companies should be exclusively in the content production business? That’s a significant shift from the industrial printing and distribution business.
In the digital media world, companies like Google and Apple have taken over, as Columbia J School Dean and former WSJ.com managing editor Bill Grueskin put it, the “profitable front end of the distribution chain,” leaving news companies with the much less profitable back end of the value chain (i.e. content creation).
But what if journalists could also be the key to news companies getting back into the distribution business, in digital media?
The greatest asset of Google, the most successful content distribution business on the web, is its ability to harness the judgment of every person who creates a hyperlink on the web, and to know which links from which sites represent more trusted judgment.
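That mechanism is, at its core, the textbook PageRank recursion: a link is a vote, and votes from pages that are themselves heavily linked count for more. A toy sketch of the idea (Google’s production ranking blends many more signals; this is only the basic recursion):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Rank pages by link votes. `links` maps each page to the pages it links to."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                # a page splits its vote evenly among everything it links to
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new[t] += share
        # pages with no outbound links spread their rank evenly over all pages
        dangling = sum(rank[p] for p in pages if not links.get(p))
        for p in pages:
            new[p] += damping * dangling / n
        rank = new
    return rank
```

Run it on a tiny web where three pages all point at one hub, and the hub floats to the top — which is exactly the judgment-harvesting the paragraph above describes.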
News companies still employ in their newsrooms arguably the greatest collective source of news judgment.
So how can news companies leverage the asset of their journalists’ news judgment?
Hint #1: Collaboration
Hint #2: Scale
News companies are notably trying to figure out how to get into the business of charging for content on the web. As Apple’s iTunes demonstrated, the key to charging for content is in effective and highly convenient packaging.
Could journalists be the key to not only creating the content but also packaging it?
Think about that for a while. More in another post.
The New York Times technology blog, Bits, which features original online reporting by all of the NYT technology journalists, has formally launched a new feature called “What We’re Reading.” This feature (powered by Publish2) illustrates a number of important best practices for how journalists and news orgs can create significant value for readers by curating the web. I’ve got six of them for you.
But first, here’s what the feature looks like, in the blog’s right sidebar, under the ad at the top (click for larger image):
And here are the six best practices:
1. Make it a collaborative effort.
With all that journalists are being asked to do on the web, it’s not ideal from a workflow perspective for one reporter or editor to carry the burden of curating the web. Bits’ “What We’re Reading,” like the blog, is a collaborative effort of NYT technology journalists as a group:
Here are all of the NYT technology journalists contributing to What We’re Reading (via a Publish2 newsgroup, designed to enable this type of collaboration):
The Bits blog is “aiming to identify a dozen or so items every weekday,” which is much easier with a dozen contributors than with one.
2. Comment to explain why the link is worth clicking.
Next to search, the greatest driver of traffic on the web is social recommendations, i.e. person-to-person recommendations. TechCrunch, for example, now gets nearly 10% of its traffic from Twitter, which is simply people recommending links to each other.
When people recommend something to each other, they typically say something about what they are recommending. This distinguishes personal recommendations from machine recommendations — algorithms can automatically pull the lede, but they can’t tell you what they think or highlight what’s interesting about a story.
NYT tech journalists make the What We’re Reading feature much more valuable — and differentiated from headlines produced by algorithms — by adding a comment to every link:
As New York Times deputy technology editor Vindu Goel observes, “readers should know why you are recommending a certain item so they can decide whether it’s worth their time to check out.”
You can think of it as a mini-blog, since blogging grew out of sharing links along with what you think about what you’re linking to.
3. Attribute links to individual journalists.
This best practice follows from commenting on each link. People click on links in Twitter, in Facebook, or in email based on WHO recommended it to them. Blogs on news sites are a great way for journalists to build up their personal brands — sharing what they’re reading is an extension of that.
4. Share links on Twitter.
Speaking of Twitter, NYT Bits journalists also share what they’re reading with Bits’ 6,700 followers on Twitter (automatically through the Publish2 newsgroup). Sharing links was Bits’ first foray beyond what most news orgs do with a Twitter account, i.e. auto-post their own headlines, and it’s a significant enhancement to the value of their Twitter feed.
One of the easiest ways for a news org to enhance its Twitter feed — and be more like individuals with lots of Twitter followers — is to share links to interesting things. It’s a fundamentally social practice.
Following best practice #2, each link shared on Twitter has the journalist’s comment (rather than the link’s title), and following best practice #3, each has the journalist’s initials after the comment, e.g. ^SH is Saul Hansell (again, done automatically via Publish2 newsgroup).
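For illustration, the composition step that Publish2 automates might look something like this (the function name and the trim-to-140-characters rule are my assumptions, not a description of Publish2’s actual code):

```python
def format_tweet(comment, short_url, initials, limit=140):
    """Compose a tweet: journalist's comment, then the link, then ^initials."""
    suffix = " %s ^%s" % (short_url, initials)
    room = limit - len(suffix)
    if len(comment) > room:
        # trim the comment, never the link or the attribution
        comment = comment[:room - 1].rstrip() + "\u2026"
    return comment + suffix
```

The point of the format is that the scarce 140 characters go to the two things readers act on: the human recommendation and the person making it.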
5. Integrate into existing workflow.
The What We’re Reading feature is well named, both as a simple description for readers and as a literal description of the workflow behind it. As Vindu observed: “As journalists, we’re constantly looking at news coverage, blogs and Web sites. Why not share the most interesting stuff we find with our readers?”
And the time required?
Vindu: “Less than a minute, which was really important to us. Publish2 worked with us to configure the selection tool to automatically include a lot of the key information, such as our Twitter feed. So when I find an item I want to share, I click on a button in my Web browser, edit the headline of the linked article if necessary (sometimes they are really long or incomplete), add a public comment and hit “Save” to send it out to the world.”
The link is automatically added to What We’re Reading via Publish2. No need to log in to a CMS or even leave the page that they’re reading.
Previously, interesting items that NYT technology journalists came across that didn’t make it into a NYT print article or a Bits blog post ended up on the “cutting room floor.” The only change to workflow is they are now sharing those interesting items with their readers — on the blog and on Twitter in one step.
6. Complement original reporting.
What We’re Reading, positioned high in the right sidebar, serves as a perfect complement to the original reporting in the main blog on the left. Curating the web, like blogging, should be a fundamental skill of every journalist who wants to create value in a web media world.
Vindu: “There is a lot of great information out there on the Web that isn’t produced by The Times. Our overarching goal as a news institution is to inform our readers. Often that’s with outside content. So What We’re Reading is part of a broader effort by The Times to feature strong third-party content on our site. For example, we have modules on our Technology home page, www.nytimes.com/technology, that show stories from respected tech blogs such as ReadWriteWeb and GigaOM.”
Lastly: Should Bits fear “sending readers away”? No more than Google or Drudge Report should. Do a great job sending readers to interesting content on the web, and they’ll keep coming back for more.
New York venture capitalist Fred Wilson is one of the most prolific and renowned bloggers on the web. And if you go to his blog, avc.com, you’ll notice that (like most blogs) he runs advertising to generate revenue. But what many of you may not know is that all the proceeds Fred generates through his blog go to charity. What a concept!! You blog for a few minutes each day, and presto! You’re supporting your favorite charity! Now, imagine if millions of people did this… imagine the impact we could have on the world.
Starting today, if you’re a blogger who uses WordPress (both hosted .com and .org), you can do precisely that. Through a newly-launched partnership, WordPress and SocialVibe (disclosure: I am on the board) are introducing a widget that will enable millions of bloggers who use WordPress to support their charities of choice.
With the SocialVibe widget, bloggers can donate real money to their charities without the need to dip into their own pockets. Instead, the money is generated by brand advertisers that the bloggers self-select as the sponsor (e.g. Showtime, Sprint, Colgate, Kraft Foods, etc.). To be more specific, once bloggers install the SocialVibe sidebar widget on their WordPress blog, money will be earned for charities every time readers engage with the widget (e.g. rating a Showtime video clip). Bloggers can switch their cause and sponsor as often as they like, and receive regular updates from their charity about goal progress and impact.
Thus far, SocialVibe has enabled people to raise close to half a million dollars for charities in just over a year’s time. Every day, members are sharing their brand sponsors with millions of friends on social platforms such as MySpace, Facebook, and now WordPress to benefit one of more than 30 non-profit organizations, such as World Food Program, Children’s Miracle Network and charity: water.
What makes this partnership especially interesting is that WordPress has, up to this point, restricted any advertising on hosted accounts (with the exception of VIP accounts). In the past they have expressed concern over advertising’s impact on spam and motivation for expression. While these concerns no doubt still exist, there are a few facets of the SocialVibe platform that may make the advertising program more palatable:
• The blogger can choose to engage with a brand partner or not.
• The benefit to the blogger comes not in dollars (or a check), but rather in the form of a donation to a charity.
• The ad unit, with charity graphics and links to relevant information, adds to the content of the site, rather than detracting from the experience as most advertising programs do.
So it is possible that the SocialVibe widget will motivate bloggers to create even better content and engage a larger audience, knowing that they now have a way to pool their individual influences to create positive change in the world. And it’s important to realize that the WordPress-SocialVibe partnership is designed to align such altruistic desires with the many corporations and brands that increasingly value social responsibility. For the brands involved, this platform provides a golden opportunity to get unique endorsements in a highly engaging manner within social media. It’s a win for all parties involved.
For more information about the SocialVibe-WordPress widget, check out the WordPress blog post.
Originally posted at BeatBlogging.org, a resource for journalists using social networks, blogs, and other Web tools to improve beat reporting.
Whenever I talk with news organizations of any size about linking to sources, resources and journalism that originated outside the walls of their newsroom, two questions come up: How and Why.
Well, conveniently enough, I work for Publish2, and we build tools that help answer the question of How. If your problem is that your content management system makes adding links directly in the text of your story a difficult task, let’s solve that by adding links in widgets, sidebars, scrolling across the bottom of the browser window, blinking in 96pt red Helvetica, pushed to Twitter — wherever and however you want them.
My standing offer on How is that if the question comes up, you can talk to me and I’ll help you out.
So back to the question of Why.
Why we link: Five reasons your news organization should tie the Web together
1. Because we owe it to our readers to give them as much information as we have at our fingertips.
Don’t we? Of course we do.
If you’re a journalist, a huge part of your job is to filter all the information relevant to your community or your beat and pass along the important parts to your readers. Think about all the press releases you get by fax or e-mail, all the phone calls, voicemail, and messages that land on your desk, and think about how you act as a filter for that flood of information. Do the same thing with the Web.
Bring your readers the best links related to your story, and they will thank you. How? By treating you like a first-class citizen of the Internet, and coming back to your news site, which is no longer a dead end backwater in the river of news, but a point of connection where they can find other interesting streams.
Chris Amico took it one step further in a tip he submitted via the Publish2 Collaborative Reporting form I used to gather some ideas for this post. “Humility is healthy,” Chris wrote. “The more we get out of this mindset that we are the sole producers of useful content, the better off we’ll be in the long run.”
2. Because linking to sources and resources is the key gesture to being a citizen of the Web and not just a product on the Web.
You might think your news organization is super-duper-Web-savvy because you put your stories online, have RSS feeds and push links to your own content out via social networks, including Twitter.
That’s Step One. And it’s a good first step.
But, if all you provide your readers is flat content that doesn’t take them anywhere else on the Web, or back up statements with direct sources, or provide resources for those who want to explore a topic beyond what you’ve been able to provide with original reporting, you’re just shoveling text into another bucket, one labeled “Web.”
If, on the other hand, you want to embrace the traits that make blogs, Twitter, and so many other online communication tools a vital part of the daily life of your readers, your news site shouldn’t feel like an endpoint in the conversation. It should feel like the beginning.
Asteris Masouras put it this way in a Twitter reply to my query about why we link:
3. Because it’s the best way to connect directly with the online community in our town.
If you’re writing about human beings, businesses, organizations, government institutions or any other life form with a presence on the Internet, linking to them in the stories you publish about them is the low-hanging fruit when it comes to participating in your local online community.
Skipping the link to the city council’s calendar when you mention the next meeting, leaving out the link to the Little League’s online scoreboard when you write a story about its resurgence or not bothering to link to the full database of restaurant inspections when you choose three to write about — these are all easy ways to miss an opportunity to connect with your community and your readers.
Start simple: If you mention a person or organization, link to them.
Many, many bonus points to be awarded if you dig deep enough into the local online community to link to relevant content created by the people in your story. Did that angry neighbor’s crusade for a new zoning law to govern branches that hang over someone else’s driveway start with an image posted to a photo-sharing site and a determined comment? Link to it.
There’s a huge upside to linking out to community members, of course. Sometimes they link back.
Wenatchee World Web Editor Brianne Pruitt dropped a tip in my form including the following statement: “The link economy is real, and important for anyone who wants to be a part of the Web ecology.” I’d translate that as: Give some, get some.
And here’s how Web developer Pete Karl answered the question of why news organizations should link to external sources:
4. Because we absolutely do not know everything, but we know where to find out most of what we don’t know.
The days of your news organization existing as a monopolistic source of local information are over, and your readers know it. They browse local, national, international, and topical news and commentary in more places than the ones you call “news.” And if they don’t, they hear it from their friends on any one of a dozen social networks. They know that you don’t know it all. And so do you.
But you’re the journalist.
You’re the filter. You’re the person in town who knows everyone who knows everyone. You’ve got the sources, whether they’re people you talk to at the community center, the city council meeting, the police station, or their Live Journal page. Bring what they know to your readers as directly as possible: Link to them.
5. Because it will make your job easier.
I know, I know. Everyone is asking you to do more with less. It’s extremely easy to tell people like me that you just don’t have time for another toy, another tool, another camera, another social network or another task.
I’m here to tell you that bringing your readers the best of the Web can save you work.
How? By opening a two-way channel to let your readers tell you what you should link to next, you’ll cut down on the time you spend looking for that next thing. By maintaining a real presence in the local link economy, you’ll make it easier for sources who know the answers to your questions to find you, and you won’t spend as much time trying to find them.
By sending your readers to the best information available on the Web, you’ll keep them coming back for more, drawing more traffic to your news site. Last time I checked, more traffic is one way to make more money, and with any luck, that’s still how you get paid.
Bonus Links on Links:
- Josh Korr, my colleague at Publish2, explores what happens when a group of news organizations collaborate to curate links when regional news breaks
- David Cohn from Spot.Us asks whether bookmarking links using social news services is an act of journalism
- Jeff Jarvis explores the ethic of the link economy
Thanks to everyone who replied on Twitter or in the Publish2 Tip Form when I asked for some of the best reasons to link out from your news site.
If the wire editor and feature editor roles are becoming obsolete for print newspapers, as Steve Yelvington persuasively argues, then those editors should be retrained — or retrain themselves — as web curators. Rather than become obsolete, these editors could become essential to their news organization’s future on the web.
On the Internet, we have no need of wire editors; if we wish to have wire content on our websites, we can plug in AP Hosted News, or run a full feed of AP Online or some similar product from another service. But with everything on the Internet just a click away, the value of such branded and hosted wire content is low (and measurable), and even that may go away before long, based on simple cost-benefit analysis. We may be better off sending users to CNN, MSNBC and NYtimes.
Feature editing faces the same problem:
But the job simply doesn’t transport to digital media. Again, everything on the planet is just a click away, much of it more interesting, entertaining and informative than can be found in the typical daily newspaper’s features.
Yet there is a HUGE opportunity in this shifting landscape. Just because there’s a wealth of content a click away doesn’t mean that news consumers know where to click in order to find it.
Instead, we have what Clay Shirky describes as “filter failure”:
Here’s what the Internet did: it introduced, for the first time, post-Gutenberg economics. The cost of producing anything by anyone has fallen through the floor. And so there’s no economic logic that says that you have to filter for quality before you publish…The filter for quality is now way downstream of the site of production.
What we’re dealing with now is not the problem of information overload, because we’re always dealing (and always have been dealing) with information overload…Thinking about information overload isn’t accurately describing the problem; thinking about filter failure is.
Local news sites may serve their readers much better by sending them to CNN, MSNBC, and NYT for non-local news, as Steve suggests. But they may also send them to local news sites in other regions for stories dealing with common issues. They may send them to local blogs and other non-MSM media sites.
There is a wealth of sources on the web. Helping readers find the best of the web could help local news sites remain daily destinations, rather than just hosts for content to be aggregated by someone else. That, in turn, could put those news operations back in the content distribution business: how they made money in print, and how they could make a lot more money on the web.
Wire and feature editors are already skilled content curators — they just need to adapt those skills to filtering the web. One challenge they can apply their news judgment to is discovering new sources of trusted information, something Google CEO Eric Schmidt admits algorithms struggle with.
For general search, we’ve been careful not to bias it using our own judgment of trust because we’re never sure if we get it right. So we use complicated ranking signals, as they’re called, to determine rank and relevance. And we change them periodically, which drives everybody crazy, as our algorithms get better. There’s no question in my experience that the top brands represented in this room would, in fact, float to the top in our search ranking. The usual problem is you’ve got somebody who really is very trustworthy but they’re not as well-known and they compete against people who are better known, and they don’t, in their view, get high enough ranking. We have not come up with a way to algorithmically handle that in a coherent way.
Another skill that would help wire and feature editors take on the challenge of filtering the web, and make them hugely relevant in the web media era, is collaboration. They could learn a lot from the editors in Washington State who have been practicing collaborative curation, whether for a statewide flood or a flu outbreak.
Publish2’s Senior Editor Josh Korr wrote about this vision for re-inventing the wire function on the web in a recent Nieman Reports piece, “A 21st Century Newswire—Curating the Web With Links.”
If I were a wire or feature editor in a newsroom, instead of waiting to become obsolete, I would start immediately learning how to be a top-notch web curator. I’d ask myself — how can I become the Jim Romenesko or Matt Drudge for my community? I would start learning how to use the tools of web curation and how to collaborate with other web curators. I’d study how newsrooms like the Chicago Tribune have created an editorial workflow for collaborating to curate the web (see Colonel Tribune Recommends on the Chicago Breaking News blog).
And if I ran a newsroom, I’d look at how I could retrain and reassign talented editors to be vital contributors to the web operation, even as their function becomes redundant for the print operation. (Or, I’d imagine a future where content from a diverse range of web sources could be licensed and curated for print — see this Josh Korr post.)
There’s still time for any journalist in the newsroom to become essential to the future of news, rather than being emblematic of the past.
Perhaps you’ve noticed a bit of activity online the last few days related to a certain not-quite-pandemic bug that’s going around.
Or, to put it in microblogging terms, #swineflu.
The wonderful thing about the ease of communication online is that anyone can start a discussion, carry it on, pass along information, retweet it, forward an e-mail, leave a comment on a blog post, or bookmark a page in a social way.
The problem, of course, is that when millions of people are desperately looking for solid, clear information, that’s when it can be the most difficult to find it.
The #swineflu hashtag on Twitter serves as a good point of reference for what Clay Shirky called “filter failure.” The problem is not that there’s a wild abundance of useful information, overloading us with detail, facts, and commentary; the problem is that we don’t have the proper filtering system set up to separate trusted sources and reliable resources from rumors, jokes, misinformation, and ephemera. If those seeking to provide links to reliable information started using a hashtag such as #therealswineflu, it would likely be overtaken — quickly — by tagged content with less value, whatever its source.
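To make the filter idea concrete: the crudest human-powered filter is a curated list of trusted sources applied to a tag stream, so the judgment lives in the list, not in the hashtag anyone can use. Here is a minimal sketch in Python, with entirely invented account names and messages:

```python
# Hypothetical data: the accounts and messages below are invented for
# illustration. The filter is not the hashtag itself (anyone can use it)
# but a human-curated set of sources we already trust.

TRUSTED_SOURCES = {"cdc_official", "who_updates", "county_health"}

stream = [
    {"user": "cdc_official", "text": "Updated #swineflu case counts and guidance"},
    {"user": "panic_retweeter", "text": "OMG #swineflu is everywhere!!!"},
    {"user": "who_updates", "text": "#swineflu travel advisory FAQ"},
    {"user": "joke_account", "text": "#swineflu memes, thread below"},
]

# Keep only items from sources a human editor has vouched for.
reliable = [item for item in stream if item["user"] in TRUSTED_SOURCES]
```

The point of the sketch is that the valuable work is maintaining `TRUSTED_SOURCES` — which is exactly the editorial judgment this post argues journalists can supply.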
So how do we solve filter failure?
We depend on humans to serve as our filters. We do this all the time, when we ask a friend a question, or talk with someone we know who happens to be an expert on a given topic. (I imagine the world’s epidemiologists are fielding a huge number of Facebook messages from old friends this week.)
When it comes to reliable sources for news that breaks on a massive scale, our best sources are likely to be Wikipedia for facts, and journalists for explanation, clarification, context, and meaningful analysis.
One way that journalists are bringing explanation, clarification, context, and meaningful analysis of the Swine Flu story to their readers today is through Publish2.
We’ve been actively encouraging journalists and news organizations using Publish2 to use the swineflu tag to mark the stories and resources they’re saving to help their readers understand what they need to know about this outbreak, put it in context, and quickly respond to it as necessary.
Look for the “Create Widget” link near the top of every topic page on the Publish2 site to embed any stream of links on your own news site or blog. The Knoxville News-Sentinel added the stream of all Publish2 links tagged with “swineflu” on a page to gather resources from around the Web.
So that’s one filter: Journalists sharing reliable information to serve their readers.
Want to drill down a little more? How about a regional group of journalists at different news organizations gathering information in a collaborative effort to serve local readers?
In the Pacific Northwest, #wanews has you covered. This group of reporters and editors in and around Washington State first came together to use Publish2 to aggregate news and information when flooding hit the area earlier this year.
It’s an innovative group, and this week has been no exception, as they’ve jumped in to form a Publish2 newsgroup where anyone they invite can post links and then embed the stream as a widget on their own site:
- The Wenatchee World embedded a “Northwest news links” stream in the sidebar of their stories about Swine Flu.
- The Walla Walla Union-Bulletin added the feed of links to a special topic page where they’re providing Swine Flu information to their readers.
- The Kitsap Sun did both, with a sidebar on story pages and a stream embedded on a topic page as well.
Solving the problem of filter failure isn’t a small task.
Want to help out? Register for Publish2 if you’re a journalist who wants to pitch in by bringing the best of the Web to your readers.
Today we’re announcing three major additions to the Publish2 team — journalists whose stellar reputations speak for themselves:
- Ryan Sholin joins us next week as Director of News Innovation.
- Greg Linch is the winner of the Publish2 Future of Journalism Contest and will join us in the fall as our Producer.
- Howard Weaver has joined our Board of Directors.
Get the full scoop at the Publish2 Blog.
There is so much misunderstanding flying around about the economics of content on the web and the role of Google in the web’s content economy that it’s making my head hurt. So let’s see if we can straighten things out.
Google isn’t stealing content from newspapers and other media companies. It’s stealing their control over distribution, which has always been the engine of profits in media. Google makes more money than any other media company on the web because it has near monopoly control over content distribution (i.e. like a metro newspaper in the pre web era).
Those who argue that Google is a friend to content owners because it sends them traffic overlook the basic law of supply and demand. The value of “traffic” is entirely relative. The more content there is on the web, the less value that content has — because of the surfeit of ad inventory and abundance of free alternatives to paid content — and thus the less value “traffic” has.
The more content there is on the web, the less money every content creator makes, and the more money Google makes by taking a piece of that transaction.
Nick Carr sums up the problem well:
What Google doesn’t mention is that the billions of clicks and the millions of ad dollars are so fragmented among so many thousands of sites that no one site earns enough to have a decent online business. Where the real money ends up is at the one point in the system where traffic is concentrated: the Google search engine. Google’s overriding interest is to (a) maximize the amount and velocity of the traffic flowing through the web and (b) ensure that as large a percentage of that traffic as possible goes through its search engine and is exposed to its ads.
The debate over whether Google’s excerpting content on its search result pages is a violation of copyright law, i.e. whether Google is effectively stealing content, overlooks the much more valuable asset that Google is appropriating. Google makes money less by its ability to display that snippet of content and much more by its ability to know that snippet of content is relevant to what the content consumer is looking for — it makes money by its ability to efficiently distribute that content.
And just how does Google know what content is most relevant, trustworthy, and valuable? How does Google know where to send the traffic that yields such diminishing returns?
Everyone talks about Google’s algorithms as if it were some giant artificial intelligence that had its own ability to judge the value of content.
The greatest irony of the web content economy is that Google by itself doesn’t have a clue what content is good or bad. Google is able to deliver relevant search results only because every site on the web helps them figure it out.
Google’s algorithm is based on reading “links” as votes for content. Every time a website links to another website, Google reads that link as a vote. The brilliance of the Google algorithm is its ability to figure out which votes should count more.
But without those links, without those “votes,” Google has nothing.
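The links-as-votes mechanism described above is the core of PageRank. The following is a toy sketch, not Google’s actual algorithm (which weighs many more signals), with a made-up three-site web; it shows how each link passes a share of the linker’s own rank to its target, so votes from well-linked pages count for more:

```python
# Simplified PageRank: links are votes, and votes from pages that
# themselves receive many votes carry more weight.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to (its 'votes')."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page starts each round with a small baseline of rank.
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, targets in links.items():
            for target in targets:
                # Each outbound link passes along an equal share of the
                # linking page's current rank.
                new_rank[target] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

# Hypothetical web: two blogs both "vote" for a news site.
web = {
    "blog-a": ["news-site"],
    "blog-b": ["news-site"],
    "news-site": [],
}
ranks = pagerank(web)
# The most-linked-to page accumulates the most rank.
```

Remove the links from the input and every page collapses to the same baseline score — which is the post’s point: without the web’s links, the algorithm has nothing to go on.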
What Google “steals” from every website isn’t the content — it’s the links.
It’s the links, stupid. And everyone gives Google their links to read — for free!!
Google doesn’t really need your content, because there’s plenty more where it came from. What Google really needs is your links, i.e. your votes for content — it needs your help separating the wheat from the chaff on the web.
The key to Google’s monopoly control over content distribution on the web is its ability to judge what’s most relevant in an increasingly large sea of content.
If media companies want to compete with Google, they need to look at the source of its power — judging good content, which enables Google to be the most efficient and effective distributor of content. They also need to look at Google’s fundamental limitation — its judgment is dependent on OTHER people expressing their judgment of content in the form of links. Above all, they need to look at sources of content judgment that Google currently can’t access, because they are not yet expressed as links on the web.
The balance of power on the web can shift — but only by understanding what the real sources of power are.
Just to clarify, the use of “steal” and “stole” is in the sense of “stole the game.” The point of this post is to explain how Google won, not at all to suggest that they didn’t deserve to win. Google’s success is a direct reflection of how much value they create, i.e. A LOT — they solved a problem that nobody else figured out how to solve, or even recognized as the huge opportunity it was. This post is also intended to help media companies better understand how Google works so that they can better compete in the web content marketplace, not to justify any feelings of “sour grapes.”
The Seattle Post-Intelligencer today became the first major metro newspaper to stop publishing in print but keep the news brand alive on the web. Seattlepi.com’s Executive Editor Michelle Nicolosi promises bold experiments, “to break a lot of rules that newspaper Web sites stick to.” And to be sure, the entire news industry will be watching to see what an editorial staff of 20 can accomplish compared to a staff of 165. (Given their intent to look “everywhere for efficiencies” — and that they won’t have “reporters, editors or producers—everyone will do and be everything” — I suspect they will accomplish more than most people think.)
But in addition to the key editorial question, Seattle has also now become a test case for one of the most important questions about the near-term future of the newspaper industry that is almost never asked:
What will happen to the print advertising when the newspaper stops publishing in print?
I asked this question a few months ago in theory, but now we get to see what happens in actuality. Logically, one or a combination of the following will happen to the newspaper’s advertising dollars:
- Vaporizes, i.e. the advertiser stops spending the money — given the economic crisis, this seems likely for some advertisers
- Shifts to Seattlepi.com — which is hiring its own sales force following the dissolution of the joint operating agreement with the Seattle Times
- Shifts to another newspaper, i.e. Seattle Times — through the JOA, the same sales force sold ads for Seattle PI and Seattle Times, so it only makes sense that some advertisers will shift some or all of their spending to the Times
- Shifts to competing local online media, e.g. The Stranger, West Seattle Blog
- Shifts to non-local media that can target local audiences, e.g. Google, Craigslist
Anyone who runs a newspaper should be watching this experiment under a microscope. Someone should even go so far as to obtain copies of the last month of Seattle PI in print and call up every display advertiser and ask them what they plan to do.
This experiment has already been playing out in Denver since the Rocky Mountain News ceased publication, but since the Rocky ceased entirely, we didn’t get to see what happened with option #2 above — and that’s the BIG question for many newspaper companies looking at online-only publishing. (The experiment in Denver could be radically altered if a new publication is launched by former Rocky staff — it’s contingent on whether they can sell enough subscriptions, which I hope they do because that is another vital experiment.)
So much of the discussion of the future of the newspaper business seems to assume only option #1 above will occur. But that’s unlikely.
Of course the big question is whether local media can find new ways to create value — and I say “create value” because that is the key to any new business (vs. “new business model,” because those discussions typically start with what the business needs, not what the market needs). I think there are tremendous opportunities for new value creation in emerging collaborative media ecosystems, but that’s for another post.
In the meantime, all eyes are on Seattle.