There has been much discussion recently about the impact weblogs are having on Google’s search results. Scoble writes about apparent pressure from advertisers to downgrade bloggers, because, as Geoffrey Nunberg puts it (writing in the NYTimes): “Google now conducts 55 percent of all searches on the World Wide Web. People have come to trust the service to act as a digital bloodhound. Give it a search term to sniff, and it disappears into the cyber wilderness, returning a fraction of a second later with the site you were looking for in its mouth. A high place in Google’s rankings can have a considerable value for commercial sites. Some go so far as to pay other sites to link to them to raise their standing.”
Doc Searls has an alternate idea:
What would happen if the archives of all the print publications out there were open to the Web, linkable by anybody, and crawlable by Google’s bots? Would the density of blogs “above the fold” (on page one) of Google searches go down while hard copy sources go up? I’ll betcha it would.
My point: Maybe this isn’t about “gaming” algorithms, but rather about a situation where one particular type of highly numerous journal has entirely exposed archives while less common (though perhaps on the whole more authoritative) others do not.
This is a point also echoed by Joi Ito: “If the big print media put their archives online and made them crawlable and linkable, I bet their page rankings would go up. It’s really the links between the archives of the blogs that gives blogs so many links. The solution to googlewashing is probably more about getting other forms of journalism published in a more link-friendly way than filtering the blogs.”
Dave Winer says it all: “Google is just indexing what’s on the Web. If you want to be in Google, you gotta be on the Web. It’s pretty simple.”