Monthly Archives: October 2006

The rebound that never came

In early 2003, in my article “IT Doesn’t Matter,” I had the temerity to suggest that companies should “spend less” on information technology, treating it as a cost of doing business rather than a means of gaining competitive advantage. The suggestion was roundly attacked by the tech industry, which reacted to the article with a kind of collective conniption fit. At the time, you’ll remember, IT spending was in the doldrums, with companies nursing their hangovers from the great IT investment binge of the 90s. The general assumption, among IT vendors in particular, was that the softness in spending was just a blip, that buyers would soon start ratcheting up their outlays again. “This is a cyclical event,” Tom Siebel confidently told the Wall Street Journal. A rebound was just around the corner.

Well, more than three years have passed, Tom Siebel’s company is kaput, and the rebound remains around the corner. It’s become clear that the slowdown in IT spending is not a passing cyclical event but a secular trend, a reflection of a basic change in the way companies view information technology. That fact was underscored last month when InformationWeek published the latest edition of its annual survey of IT spending among the “InformationWeek 500” – the companies that it identifies as being the most innovative users of IT. The survey reveals not only that IT budgets haven’t jumped since 2003, but that in fact they’ve continued to erode. Between 2003 and 2006, IT spending as a percentage of revenue fell on average from 3.66% to 3.21%. Of the 21 industry sectors tracked by InformationWeek, only 5 saw an increase in IT spending as a percentage of sales over the last three years. In absolute terms, IT expenditures have dropped as well, from an average of $353 million in 2003 to $304 million today. Of those shrinking budgets, moreover, the percentage devoted to purchases of new hardware and software slipped from 37% in 2003 to 34.5% today.

Remember, these are the most innovative users of technology, the ones that set the pace for everyone else.

What’s perhaps most revealing about the InformationWeek study is the trend it reveals in IT spending among IT vendors themselves. They’ve actually been reducing their IT outlays as a percentage of revenue even more aggressively than the average. The most innovative tech companies spent 4.4% of sales on IT in 2003, 4.0% in 2004, 3.5% in 2005, and just 3% in 2006. As the magazine reports, “The IT industry has been engaged in a multipronged attack on improving operational efficiencies. Not surprisingly, it’s a leader in the use of relatively new technologies such as multicore processors, server blades, and virtualization software to create more cost-effective deployment strategies.” The clear implication is that, as other companies begin to capitalize on these same advances, they, too, should be able to achieve even greater reductions in the amount of money they devote to information technology.

Earlier this month, Silicon.com asked a group of UK CIOs whether they agreed with my contention that companies should be spending less on IT. Two-thirds of them thought I was right. One termed the idea “just common sense for businesses.” It’s not the first time that heresy has turned into dogma.

Trailer park computing

In a recent post on his blog, Sun CEO Jonathan Schwartz coyly hinted at a rethinking of the corporate data center. “Now I understand that IT infrastructure has to be put somewhere,” he wrote. “But the whole concept of a datacenter is a bit of an anachronism. We certainly don’t put power generators in precious city center real estate, or put them on pristine raised flooring with luxuriant environmentals, or surround them with glass and dramatic lighting to host tours for customers … Surely it’s time we all started revisiting some basic assumptions.”

It wasn’t hard to see that Schwartz had something up his sleeve.

Today, in addition to announcing an expanded push into data center virtualization, Sun is revealing that a year from now it plans to begin selling readymade data centers in shipping containers at a starting price of a half million bucks a pop. Designed by supercomputing genius Danny Hillis, the data-center-in-a-box will, Schwartz told the New York Times’s John Markoff, “be attractive to customers that need to expand computing capacity quickly.”

The container, designed to hold up to 245 server computers, can be plopped anywhere that has water and electricity hookups. “Once plugged in,” reports Markoff, “it requires just five minutes to be ready to run applications.”

Welcome to trailer park computing.

The containerized data center is one more manifestation of the fundamental shift that is transforming corporate computing – the shift from the Second Age client-server model of fragmented, custom-built computing components to the Third Age model of standardized, utility-class infrastructure. As this shift plays out, the center of corporate computing will move from the personal computer upstream to the data center. And, inevitably, what happened to the PC – standardization and commoditization – will happen to the data center as well. What is Sun’s data-center-in-a-box but an early example of the data center as a standardized commodity, an off-the-shelf, turnkey black box? Indeed, the initiative’s codename is Project Blackbox – and the prototype container that Sun is showing off is painted black.

The effort reflects Hillis’s belief that computing is fated to become a utility, writes Markoff:

Long an advocate of the concept of utility computing, analogous to the way electricity is currently delivered, Mr. Hillis said he realized that large companies were wasting significant time assembling their own systems from small building blocks. “It struck me that everyone is rolling their own in-house and doing manufacturing in-house,” he said. “We realized that this obviously is something that is shippable.”

In many ways, the containerized data center resembles the standardized electricity-generation system that Thomas Edison sold to factories at the end of the 19th century and the beginning of the 20th. Manufacturers bought a lot of those systems to replace their complex, custom-built hydraulic or steam systems for generating mechanical power. Edison’s off-the-shelf power plant turned out to be a transitional product – though a very lucrative one. Once the distribution network – the electric grid – had matured, factories abandoned their private generating stations altogether, choosing to get their power for a monthly fee from utilities, the ultimate black boxes.

Something similar will happen – is happening – with computing, but how exactly computing assets end up being divided between companies and utilities remains to be seen. In the meantime, commodity data centers, in various physical and virtual forms, should prove increasingly popular to companies looking to radically simplify their computing infrastructure and reduce the single biggest cost of corporate computing today: labor.

UPDATE: Dan Farber covers the launch of the Blackbox, while Jonathan Schwartz makes Sun’s marketing pitch and Greg Papadopoulos puts the machine into the context of data-center evolution. Blackfriars calls it “the ultimate computing commoditization play,” saying it “changes the economics” of data center construction. Techdirt is skeptical about the size of the market: “The sweet spot of companies for whom this will be ideal seems small. Its impact on Sun’s business won’t be as significant as what it represents, the continuing commoditization of corporate infrastructure.” Sun’s Tim Bray writes, “I have no idea how big the market is. But I’m glad we built it, because it is just totally drop-dead fucking cool.” (Question for Miss Manners: Is it kosher for a corporate blogger to use the f-word?)

Excuse me while I blog

Blog. Blog.

Say it five times in a row, preferably out loud: Blog. Blog. Blog. Blog. Blog. Has there ever been an uglier word? You don’t say it so much as you expectorate it. As though it carried some foul toxin that you had to get out of your mouth as quickly as possible. Blog! I think it must have snuck into the language in disguise. Clearly, it was meant to mean something very different. I’d guess it was intended to be a piece of low slang referring to some coarse bodily function.

Like: “Can we pull over at the next rest area? I really have to blog.”

Or: “The baby was up all night blogging.”

Or: “Oh, Christ, I think I just stepped in a blog.”

But somehow it escaped its scatological destiny and managed to hitch itself, like a tick, to a literary form. Who’s to blame? According to Wikipedia, which, needless to say, comes up as the first result when you google blog, Peter Merholz is the man whose name shall live in infamy. While Jorn Barger introduced the term “web log” – on December 17, 1997, to be precise – it was Merholz who “jokingly broke the word ‘weblog’ into the phrase ‘we blog’ in the sidebar of his blog Peterme.com in April or May of 1999. This was quickly adopted as both a noun and verb.” A passing act of silliness for which we all must now suffer. Thank you, Peter Merholz.

It doesn’t seem fair. No other literary pursuit is saddled with such a gruesome name. No one feels ridiculous saying “I am a novelist” or “I am a reporter” or “I am an essayist.” Hell, you can even say “I am an advertising copywriter,” and it sounds fairly respectable. But “I am a blogger”? Even when you say it to yourself, you can hear the sniggers in the background.

Imagine that you, a blogger, have just become engaged to some lovely person, and you are now meeting that lovely person’s lovely parents for the first time. You’re sitting on the sofa in their living room, sipping a Cape Codder.

“So,” they ask, “what do you do?”

A tremor of shame flows through you. You try to say “I am a blogger,” but you can’t. It lodges in your throat and won’t budge. Panicked, you take refuge in circumlocution: “Well, I kind of, like, write, um, little commentaries that I, like, publish on the Internet.”

“Little commentaries?”

“Yeah, you know, like, commentaries.”

“About what?”

“Well, generally, they’re commentaries that comment on other commentaries.”

“How fascinating.”

You’re getting deeper into the mire, but you can’t stop yourself. “Yeah. Usually it starts with some news story, and then I and a whole bunch of other people, other commentarians, will start commenting on it, and it’ll just go from there. I mean, imagine that there’s this news story and that a whole bunch of mushrooms start sprouting off it. Well, I’m one of those mushrooms.”

Face it: even “fungus” is a nicer word than “blog.” In fact, if I had the opportunity to rename blogs, I think I would call them fungs. Granted, it’s not exactly a model of mellifluousness either, but at least its auditory connotations tend more toward the sexual than the excretory. “I fung.” “I am a funger.” Such phrases would encounter no obstacle in passing through my lips.

But “I am a blogger”? Sorry. Can’t do it. It sounds too much like a confession. It sounds like something you’d say while sitting in a circle of strangers in a windowless, linoleum-floored room in the basement of a medical clinic. And then you’d start sobbing, covering your face with your hands. And then the fat woman sitting next to you would put her hand on your back. “It’s all right,” she’d say. “We’re all bloggers here.”

Easy as pie

“News is not like the symphony,” writes Dave Winer, “it’s like cooking dinner.” He’s “totally sure,” he says, that he knows how the future of news will play out:

In ten years news will be gathered by all of us. The editorial decisions will be made collectively, and there will be people whose taste we trust who we will turn to to tell us which stories to pay attention to … The role of gatekeeper will be distributed, as will the role of reporter. Very few people, if any, will earn a living doing this, much as most of us don’t earn a living by cooking dinner, but we do it anyway, cause you gotta eat.

Yesterday, the Associated Press reported on the many journalists who have been killed covering the Iraq war:

Western journalists covering the war in Iraq face sniper fire, roadside bombs, kidnappers and a host of other dangers. Their Iraqi colleagues must cope with even greater risks, including families attacked in retribution for sensitive reporting, and arrest on suspicion of links to the violence journalists cover.

At least 85 journalists – mostly Iraqis – have been killed since the U.S.-led invasion in March 2003 – more than in either Vietnam or World War II. The security situation is getting progressively worse, and 2006 has been the deadliest year yet, with at least 25 journalists killed to date.

A week ago today, the Russian reporter Anna Politkovskaya was murdered. She was “shot in the chest as she was getting out of an elevator, then shot in the head.” The same day, two German reporters were murdered inside the tent they had pitched on the side of a road in Afghanistan. Last year, 47 reporters were killed while doing their jobs. The year before that, the death toll was 53.

“It’s easier for readers to become reporters,” Winer says, “than it is for reporters to become readers.”

Thanks for the insight.

United States vs. Google

Every era of computing has its defining antitrust case. In 1969, at the height of the mainframe age’s go-go years, the Justice Department filed its United States vs. IBM lawsuit, claiming that Big Blue had an unfair monopoly over the computer industry. At the time, IBM held a 70 percent share of the mainframe market (including services and software as well as machines).

In 1994, with the PC age in full flower, the Justice Department threatened Microsoft with an antitrust suit over the company’s practice of bundling products into its ubiquitous Windows operating system. Three years later, when Microsoft tightened the integration of its Internet Explorer browser into Windows, the government acted, filing its United States vs. Microsoft suit.

With Google this week taking over YouTube, it seems like an opportune time to look forward to the prospect – entirely speculative, of course – of what could be the defining antitrust case of the Internet era: United States vs. Google.

That may seem far-fetched at this point. In contrast to IBM and Microsoft, whose fierce competitiveness made them good villains, Google seems an unlikely monopolist. It’s a happy-face company, childlike even, which has gone out of its way to portray itself as the Good Witch to Microsoft’s Bad Witch, as the Silicon Valley Skywalker to the Redmond Vader. And yet, however pure its intentions, Google already has managed to seize a remarkable degree of control over the Internet. According to recent ComScore figures, it already holds a dominant 44 percent share of the web search market, more than its next two competitors, Yahoo and Microsoft, combined, and its share rises to 50% if you include AOL searches, which are subcontracted to Google. An RBC Capital Markets analyst recently predicted that Google’s share will reach 70 percent. “The question, really,” he wrote, “comes down to, ‘How long could it take?'”

Google’s AdSense ad-serving system, tightly integrated with the search engine, is even more dominant. It accounts for 62 percent of the market for search-based ads. That gives the company substantial control over the money flows throughout the vast non-retailing sector of the commercial internet.

With the YouTube buy, Google seizes a commanding 43 percent share of the web’s crowded and burgeoning video market. In a recent interview, YouTube CEO Chad Hurley said that his business enjoys a “natural network effect” that should allow its share to continue to rise strongly. “We have the most content because we have the largest audience and that’s going to continue to drive each other,” he said. “Both sides, both the content coming in and the audience we’re creating. And it’s very similar again to the eBay issue where they had an auction product that gained critical mass.”

Google has been less successful in building up its own content and services businesses, but it’s a fabulously profitable company, thanks to its AdSense money-printing machine, and it can easily afford to acquire other attractive content and services companies. It can also afford, following the lead of Microsoft in the formative years of the PC market, to launch a slew of products across many different categories and let them chip away at their respective markets – which is exactly what it’s been doing. Moreover, its dominance in ad-serving enables it to cut exclusive advertising and search deals with major sites like MySpace, expanding its influence over users and hamstringing the competition.

Google’s corporate pronouncements are carefully, and, by all accounts, sincerely, aimed at countering fears that it is building a competition- and innovation-squelching empire. But its actions often belie its rhetoric. Its founders said they had no interest in launching an internet portal, but then they launched an internet portal. They said they wanted customers to leap off Google’s property as quickly as possible, but then they began cranking out more and more applications and sites aimed at keeping customers on Google’s property as long as possible. The company’s heart may be in the right place, but its economic interests lie elsewhere. And public companies aren’t known for being led by their hearts.

Nothing’s written in stone, of course. Someone could come up with a new and more attractive method of navigating the web that would quickly undermine the foundation of Google’s entire business. But it’s useful to remember that the commercial internet, and particularly Web 2.0, is all about scale, and right now scale is very much on Google’s side. Should Google’s dominance and power continue to grow, it would inevitably have a chilling effect on innovation and hence competition, and the public would suffer. At that point, the big unasked question would start being asked: should companies be able to compete in both the search/ad business and the content/services business, or should competition in those businesses be kept separate? If there is ultimately a defining antitrust case in the internet era, it is that question that will likely be at its core.

Innovation, not infrastructure

Salesforce.com’s announcement Monday that it would open up its proprietary Apex programming language to customers and other developers marks an important turning point for the company. It’s been clear, at least since the announcement of its AppExchange software marketplace a year ago, that Salesforce’s ambitions go well beyond providing a simple customer relationship management system. With Apex, those ambitions come into clear focus: Salesforce doesn’t want to be your CRM supplier; it wants to be your data center. It wants to underpin and run all your enterprise applications, while giving you the tools to customize them. Its original slogan “Success, Not Software” appears to be morphing into a new one: “Innovation, Not Infrastructure.” That phrase appears, in fact, in the press release announcing Apex (though it’s spoken by an AMR researcher).

Last year, I questioned Salesforce’s decision to run its software-as-a-service application on its own infrastructure rather than have that infrastructure hosted by a hardware utility. Now, I understand the rationale for the decision: the infrastructure is the product. While Salesforce’s move opens up new opportunities for the firm, it also dramatically widens the competition it will face. Everyone from Microsoft to Google to Amazon is moving into the business of being an infrastructure utility. And, in an age of standardization, it will be interesting to see how customers react to the idea of running their enterprise applications in a private language. Is Salesforce the SAP of the SaaS world – and is that a good or a bad thing?

Dan Farber reports that venture capitalist Mark Gorenberg views Apex not just as a turning point for Salesforce but as a watershed for corporate computing in general. Gorenberg “hailed Apex as the most significant announcement since Sybase announced stored procedures. In effect, Gorenberg said, stored procedures led to the change from mainframe to client/server computing. ‘Apex will be the big tsunami for a new platform for applications,’ he concluded.” That’s a bit of an overstatement, but what Salesforce is doing is certainly part of a big tsunami in business computing, a tsunami that does indeed mark the transition away from the client-server age into what I’ve called the third age of IT. At the center of that age will stand not the PC but the utility-class data center, providing companies with at once greater efficiency and greater flexibility.

UPDATE: In another post, Dan Farber interviews Salesforce’s tech guru Parker Harris, who goes deep into Apex’s technical details. Meanwhile, Sinclair Schuller makes the case against what he terms the Apex “handcuffs.”

A glass house

Michael Arrington, founder of the popular technology news blog TechCrunch, traveled to Washington, DC, last weekend to participate in a panel discussion at the annual conference of the Online News Association. He apparently used the occasion to excoriate newspapers and the journalists who work for them, directing his ire most pointedly at the New York Times and its coverage of technology. He cast particular aspersions on an October 2005 Times story on a startup named Inform. As another panelist, the blogger Jeff Jarvis, reported, “Arrington launched attacks on news media, contending that journalists will be losing their jobs and that reporters are fools if they don’t quit and start blogs. He then tried to sucker-punch The New York Times, arguing that the only reason the paper could have written a favorable story about the startup Inform was if the reporter or editor had ties, financial or otherwise, with the firm.”

Arrington himself, on his personal blog, wrote about his experience:

I made a few main points when I spoke. I said that Digg was more interesting to me than the New York Times because the crowd determines what’s on the home page, not some editor I neither know nor necessarily trust. I also made some points about journalism in general after a few defensive flurries were sent my way. First, that most mainstream media isn’t interesting to me because they report news so late. By the time something hits the New York Times, it’s usually at least a day old in the blogosphere. Second, I was discouraged by the fact that there is no discussion in mainstream media. Publications never cite their competition, and readers cannot say what they think (as they can with blog comments). And third, I encouraged journalists who were stuck in the big media machine, with their career going nowhere, to consider blogging as an alternative … I also called out the New York Times in particular – their recent launch of an offline news reader showed that they don’t get what consumers really want, I said. And I also said that many of the fluff pieces in the Times technology section must either be generated from back scratching, or lack of understanding of the product … Instead of sparking an intelligent debate I was roundly attacked. It’s the first time I addressed “real” journalists head on, and all I saw was fear, loathing and disdain.

When he called into question the ethics of the New York Times reporter, Arrington was upbraided both by his co-panelist and by the audience. Jarvis wrote: “I challenged him immediately, saying that this is a grave charge and that he clearly had no facts to back it up; he said as much. I also made it clear that Inform is, in some ways, a competitor with Daylife and that Arrington is also an investor in Daylife. It didn’t stop him. He repeated this attack, among others, on The Times. It was most uncomfortable, even embarrassing.” Staci Kramer of the PaidContent blog, who attended the session, also reported that Arrington “accused an NYT reporter of going in the tank on a story, then apologized when confronted directly by the NYT’s Jim Roberts, who challenged him to provide facts or back down. His reply: ‘I apologize. I have no facts to support my statement.'”

As I read of Arrington’s disparaging comment about the Times article, I recalled, with some disquiet, an exchange I had with him back in early February. At the time, there was a heated discussion going on about conflicts of interest among bloggers, a discussion spurred by a February 9 Wall Street Journal article by Rebecca Buckman on the lack of clear ethical standards in the blogosphere. Buckman interviewed Arrington for her article, and he stressed that, though he both wrote about Web 2.0 startups and acted as an adviser to some of them, he was always careful to disclose any possible conflict of interest. I had been following Arrington’s blog, and as I read his statements in the Journal I was reminded of what I had previously sensed to be a possible conflict related to his long-undisclosed involvement with a startup named Edgeio. After finishing the Journal article, I read back through some of Arrington’s writings, and, on the morning of February 10, I drafted the following post, which I intended to publish on this blog:

Rebecca Buckman’s controversial Wall Street Journal article about possible conflicts of interest in the blogosphere has provided a good occasion for taking stock (no pun intended). As Buckman wrote, the ethics of blogging “can be a murky issue in today’s clubby blogosphere, where many people including venture capitalists, lawyers and journalists write about Web issues and companies – and often, each other – with little editing. The rebound in Silicon Valley’s economy, coupled with the popularity of cheap, easy-to-use blogging tools, means there are more aspiring commentators than ever opining about start-ups and tech trends on the Web. And increasingly, it is difficult to discern their allegiances.”

One example of the murkiness that surrounds bloggers’ allegiances can be found in Michael Arrington’s much-read and much-quoted TechCrunch blog. Arrington does a great job of covering new “Web 2.0” companies and services. His reviews are smart and succinct, and they get a lot of attention. Arrington and his blog are mentioned in Buckman’s article:

“One popular blog that often writes positively about young tech companies, TechCrunch, is run by a lawyer and entrepreneur, Michael Arrington, who occasionally serves as an adviser to companies he has written about. He sometimes receives stock in those small companies, he says. But Mr. Arrington says he generally doesn’t write about start-ups he’s advising after he becomes affiliated with them – and ‘if I did, I would put a disclaimer up’ on the blog, he says.”

Arrington’s to be applauded for his commitment to disclosure, but the issue may be more complicated than it at first seems. What happens, for instance, when Arrington reviews new sites or services that compete with ones he’s affiliated with? Does he – should he? – also disclose his interests in those cases? How does one balance one’s role as an unbiased reviewer with one’s role as a paid adviser?

As Buckman notes, Arrington is an entrepreneur as well as a blogger and an adviser. In particular, he’s a cofounder of a soon-to-be-launched company named Edgeio. Edgeio began to unveil itself earlier this week when Arrington’s partner, Keith Teare, talked about the service in a presentation. Business Week’s Rob Hof attended the presentation and summed up Edgeio’s business model: “essentially, Edgeio is doing just what its tagline says: gathering ‘listings from the edge’ – classified-ad listings in blogs, and even online product content in newspapers and Web stores, and creating a new metasite that organizes those items for potential buyers.” As an aggregator of personal ads, Edgeio will compete with such powerhouses as eBay and Craigslist. It will also compete with Google Base, another new service that provides an alternative way to aggregate classified advertising.

Back on October 25, 2005, as rumors about Google Base first began to swirl around the blogosphere, Arrington wrote this about Base on his blog: “Google Base appears to be a service to publish content directly to google and have them host it in a centralized way. If so, this would be going completely against the accelerating trend of decentralized publishing. My prediction: when the dust settles, this will either be largely ignored or universally hated. Centralized content is boring … so much is going on at the edge of the web, why would anyone try to put it all back in the center?” Arrington didn’t mention his connection with Edgeio in the post, though, in retrospect, his comment that “so much is going on at the edge of the web, why would anyone try to put it all back in the center?” describes one of the key assumptions underlying Edgeio’s service – and one of its key points of differentiation from Base.

Clearly, Edgeio must have been much on Arrington’s mind at the time. Four days before the Base post, on October 21, TechCrunch had hosted its third “Meet-up” event, one of whose sponsors was Edgeio. Two days after his Base post, on October 27, he announced his connection with Edgeio in a post, noting that he and his cofounders had been working on the service “for most of this year.” He provided a teaser about what Edgeio would do, without going into any details: “Edgeio will give you the ability to do new and (we think) really exciting things with your blog. If you have a weblog and you’d like to be part of early testing, there is a field for giving us your blog address as well.”

On November 11, 2005, Google officially launched its Base service, and Arrington immediately trashed it in a review titled “Google Base Launched. Yuck.” His “bottom line” assessment of Google Base went as follows: “This is not a very interesting application in its current form. Keith Teare says it’s like a 1985 dBASE file with less functionality. It’s ugly. It’s centralized content with less functionality than ebay or craigslist. The content is not integrated directly into Google search results, but ‘relevance’ can bump it up into main and local search (and froogle).”

Again, no mention of his involvement with a company that would compete with Google Base.

Arrington’s reviews of Base are entirely reasonable. Some people praised Base when it appeared, but plenty of other people were highly critical of it, in the same ways that Arrington was. There’s no reason to think that his reactions to Base were anything but sincere. At the same time, it’s hard to believe that they weren’t colored by his involvement with Edgeio. The story underscores one of the tricky questions that Buckman’s article has brought to the surface: How many hats can a blogger wear?

I hesitated before publishing the post. I thought it only fair to ask Arrington for his perspective. So, on the morning of February 10, I sent him the following email:

Dear Mr. Arrington,

I write the Rough Type blog (roughtype.com) and am a dedicated reader of your TechCrunch blog. In the wake of the WSJ blogging story, I’ve been thinking about the complex issue of disclosure in the blogosphere. It fits with a broader subject that interests me deeply: the reliability of information provided on the web and, particularly, provided through the Web 2.0 model. I noted your statement on the subject of disclosure in the WSJ piece and wanted to follow up with you for a followup post I’m writing on this subject.

You note in the Journal that you avoid writing about companies you’re affiliated with. But what about writing about companies that compete with companies you’re affiliated with? I’m thinking, in particular, of two critical reviews you posted about Google Base last fall (Oct. 25 and Nov. 11). You didn’t mention in either of those posts that you were a cofounder of a company, Edgeio, that would compete with Base for listings. In retrospect, do you think you should have disclosed the Edgeio affiliation? And if not, why not? More broadly, how do you think about striking the right balance between being an impartial reviewer and also pursuing business interests?

I don’t mean to put you on the spot. I just want to make sure I know your perspective on the subject.

Thanks,

Nick Carr

Some hours later, Arrington responded with this email:

check out my posts on oodle, a direct competitor, and other classifieds companies like Microsoft Expo.

Today is a bad day to make accusastions like this. http://www.crunchnotes.com/?p=144

Before you post an attack piece, please make sure you research the facts.

Mike

I was disappointed that he didn’t bother to answer or even acknowledge my questions, but I did find and read his reviews of Oodle and Microsoft Expo. And they were positive reviews. About Oodle he wrote: “Oodle is all about decentralized content, a theme I constantly talk about, and I’m in their corner.” About Expo he wrote: “I have been testing the service, and there are features that are top notch. This is going to be an impressive product.” In neither case, though, did he disclose that he was engaged in launching a competing classifieds site.

Still, even though I believed he should have disclosed what I viewed as a conflict of interest, it seemed to me that he was not being duplicitous. I also felt genuinely sorry that he had been called a racist that day (as the link in his email revealed). So I decided to give him a break and not run the post I’d written. I informed him of my decision in the following email:

Thanks for the response and for pointing me to the Oodle and Expo reviews. I’m not pursuing this.

Sorry about the grotesque racism charge.

Nick

The next day, he wrote back:

Nick,

I apologize for my email yesterday. I was a total jerk. It was a very bad day, but that is no reason to take it out on you.

You raise a good point and it is something that I have to deal with in an honest way.

Mike

That same day, Arrington wrote a brief post about Edgeio, which began, “Edgeio is a startup that I co-founded with Keith Teare last year. Because of the clear conflict of interest I won’t be writing about edgeio that much on TechCrunch.” He made no mention of his earlier critiques of other classifieds sites.

I’m not sure I did the right thing in withholding the post I’d written. It raised valid issues – issues that bloggers should be struggling with rather than ignoring. Though I believed Arrington was giving an honest review of Google Base, I also believed the review was influenced by his involvement with a competing company pursuing a different strategy – a strategy that was implicitly promoted in his criticism of Base’s approach (and in his praise of Oodle’s). Arrington, I still think, made a mistake in failing to disclose his financial interest in a classifieds site when he reviewed other classifieds sites. I have no doubt that if he had written such reviews for the Times, or pretty much any other newspaper, without disclosing the conflict of interest, he would have lost his job – and rightly so. It seems strange, in this light, that he would choose to question in a public forum the integrity of a newspaper and one of its reporters.

But maybe it’s not so strange. Blogging is a new and immature medium, with few rules and few traditions, and bloggers have a tendency to think of themselves as being liberated from the constraints of traditional media. That’s only natural. But it’s also, in large measure, an illusion. Many of the constraints that reporters operate under evolved over the years in order to temper the freedoms that could lead, and sometimes did lead, to the abuse of the public trust. Traditional journalism has its weaknesses, as any journalist will tell you, but it has many strengths as well, strengths that are hard-earned and worthy of respect. Many bloggers assume that blogging represents a step forward when, in important ways, it actually represents a step backward.

When it comes to conflicts of interest, or other questions of journalistic ethics, the proper attitude that we bloggers should take toward our counterparts in the traditional press is not arrogance but humility. In this area, as in others, blogs have far more to learn from newspapers than newspapers have to learn from blogs.