Info Aggregator Review

MasterNewMedia.org (Robin Good) has an excellent review of our Info Aggregator, rating it a “breakthrough” service.

Info Aggregator is a breakthrough service offered by the skilled team of developers at BlogStreet.

For the amount of time that it has taken me to set it up and to start making good use of it, I would rate this service as excellent.

Info Aggregator is only one among several smart, useful and effectively designed tools and services made freely available by BlogStreet. If you are a Webmaster, independent publisher or Weblog/Blog author you should consider giving this site a serious look.

In particular I am very impressed by the amount of quality ideas and services that BlogStreet has recently been able to come up with. True, they still have a long way to go. But if I can judge by the ingeniousness of the services developed so far, they look well set for successful growth.

My highest kudos to BlogStreet for the great work done so far while looking forward to greater ease of use, support and stupid-proof documentation.

Robin Good (or perhaps it should be Luigi Canali De Rossi?) also points out areas where we need to do more work. As I wrote in an email to him: “When we launched, we weren’t sure how helpful this would be to people (it was very useful for me, considering the reading I did!) In the past few months, we have seen good interest and will now increase our efforts to make this a world-class service.” Thanks very much, Robin!

Shifting IT Jobs

Robin Miller takes a closer look at the “problem” of IT jobs shifting away from the US to other countries like India and concludes:

In the end, like it or not, we here in the U.S. are going to have to learn how to deal with a truly worldwide IT economy. Some IT workers here may be forced to leave the “computer industry” and move into non-offshorable jobs, but this may not mean they give up doing computer work, because as our economy continues to shift away from manufacturing and toward services, we may see just as many non-portable IT “support” jobs created as we would if we decided our economic future was best served by trying to turn our economy back to its traditional dependence on manufacturing.

The upshot: Even though hundreds of thousands of programming and other IT jobs are likely to leave the U.S. over the next few decades, the vast majority of U.S. IT workers will survive, and possibly even prosper in the end, although they may have new employers and work in new fields. As trucking companies expand and become increasingly computerized, for example, new jobs maintaining mobile data systems will be created.

The trick to staying gainfully employed in the IT industry — and to breaking into it — is, as always, a matter of spotting growth areas and moving toward them. This doesn’t just mean learning new programming languages or how to build, install, and repair new types of hardware, but also keeping up with business news to see what industries may offer the greatest future opportunities.

And those industries will probably not be “computer” industries in the traditional sense. As computing devices become more common in places they weren’t traditionally a major factor — which can be anywhere from a tomato packing plant to a ready-mix cement distributor — so will computing jobs.

Your next “IT job” may be in an industry you didn’t even think about a few years ago. It may be in a place you never thought of as an “IT mecca.” But if you have solid skills, whether as an entry level programmer or sysadmin or as a top-level IT manager or CIO, some company out there almost certainly needs someone just like you. The trick is finding that company — but that’s another article for another day.

I think US IT professionals should look at the emerging markets as opportunities. These are IT’s next markets. India needs affordable IT solutions; so do China, Brazil and Africa. The US has always been good at innovating. Can it now come out with solutions (hardware and software) that are a tenth of the current costs? Do it, and see amazingly large markets open up.

There is a digital divide between the developed markets and the emerging markets. The opportunity lies in crossing this chasm and creating disruptive innovation for technology’s next markets.

Genetic Algorithms

Steven Johnson writes:

[A genetic algorithm] creates a random population of potential solutions, then tests each one for success, selecting the best of the batch to pass on their “genes” to the next generation, including slight mutations to introduce variation. The process is repeated until the program evolves a workable solution. Originally developed in the 1960s by John Holland at the University of Michigan, genetic algorithms are increasingly being harnessed for real-world tasks such as designing more efficient refrigerators.

Genetic algorithms make it possible for computers to do something profound, something that looks an awful lot like thinking. And that little animated figure learning how to walk showcases some design developments that permit computers to make their own decisions without guidance from humans.

Bill Gross [of Idealab] believes genetic algorithms have the potential to revolutionize engineering. Instead of using software as merely a visualization tool that helps draw a contraption, he envisions genetic algorithms that can handle the entire design process. You define your organism, your genes, and your fitness function, and let the software do the hard work of actually figuring it out.

“I think this is the way engineering should be done: Instead of defining your part or your circuit board, define your objective and let the software evolve the answer. Let’s say I want a table. Instead of drawing out a table, you say, My constraints are these: I want a plane at this height, with this sideways rigidity, and so on. And then you tell the software, OK, you’ve got bars, beams, screws, bolts: make the best thing you can at the lowest cost.”

Genetic algorithm advocates often talk about their software in the language of ecosystems: predators and prey, species and resources. But Gross has another idea: less rain forest and more assembly line. “Let’s say you give the software access to the entire McMaster-Carr industrial supply catalog. They have 400,000 parts in stock: screws, bolts, hinges, everything. So you’ve got the whole gene pool of those parts available.” Somewhere in that mix is the machine you’re dreaming of, and simulated evolution may well be the fastest way to find it.

“You state your objectives, let the thing evolve with the optimum combination of parts at the lowest price, and the machine will be there this afternoon,” Gross says, his voice rising with excitement. “That’s an extreme exaggeration, but not that extreme!”

If I could take time off from daily work, this is the one area I would like to work on!
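The loop Johnson describes maps almost line for line onto code. Here is a minimal sketch in Python; the bit-string genome, population size, mutation rate and fitness function are all illustrative assumptions of mine, standing in for a real objective like Gross’s table constraints:

```python
import random

POP_SIZE, GENOME_LEN, MUTATION_RATE = 50, 20, 0.05

def fitness(genome):
    # Illustrative objective: maximise the number of 1-bits.
    # A real fitness function would score, say, a table design
    # against height, rigidity and cost constraints.
    return sum(genome)

def mutate(genome):
    # Slight mutations introduce variation into the next generation.
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def crossover(a, b):
    # Two parents pass on their "genes" around a random split point.
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

# Create a random population of potential solutions.
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for generation in range(200):
    # Test each one for success, selecting the best of the batch.
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == GENOME_LEN:
        break  # the program has evolved a workable solution
    parents = population[:POP_SIZE // 2]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

print(f"solved in {generation} generations: {population[0]}")
```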

Auditing Warren Buffett

Baseline takes a somewhat critical look at Warren Buffett’s management style (not something one sees often): “Shareholders nearly deify Warren Buffett and now he’s an advisor for Arnold Schwarzenegger. But that doesn’t mean other companies can or should follow the way the avuncular champion of business ethics conducts his own affairs.”

New-Look BlogStreet

BlogStreet sports a new look. The various sections have been categorised into four:
– Blog Profile
– RSS Ecosystem
– Search and Directory
– Blog Tops

This is our third iteration of the design since we launched a year ago. It was prompted by the fact that we had launched a lot of features over that period, but they were not getting adequate exposure. The redesign should help.

We have also added an RSS Publisher for creating one’s own standalone RSS feeds.
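For those wondering what a standalone feed amounts to: an RSS document is plain XML, so one can be published without any blog behind it. A minimal sketch in Python (the channel, items and URLs here are made up for illustration, and this is not the actual code behind RSS Publisher):

```python
from xml.sax.saxutils import escape

def rss_feed(title, link, items):
    # Build a minimal RSS 2.0 document from (title, link, description) tuples.
    parts = ['<?xml version="1.0"?>',
             '<rss version="2.0"><channel>',
             f'<title>{escape(title)}</title>',
             f'<link>{escape(link)}</link>']
    for item_title, item_link, description in items:
        parts += ['<item>',
                  f'<title>{escape(item_title)}</title>',
                  f'<link>{escape(item_link)}</link>',
                  f'<description>{escape(description)}</description>',
                  '</item>']
    parts.append('</channel></rss>')
    return '\n'.join(parts)

print(rss_feed("My Standalone Feed", "http://example.com/",
               [("First item", "http://example.com/1", "Hello, RSS.")]))
```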


Microcontent Wiki

Richard MacManus brings forth an interesting viewpoint: “one part of the Writeable Web is often overlooked: weblog comments. Often some of the best nuggets of content can be found buried in a comment attached to a weblog post. I’ve even coined a phrase for this: Microcontent Wiki, which is defined as: Weblog Post + Comments. It’s microcontent because it’s usually content based around a single theme or topic (defined by the weblog post). And it’s like a Wiki because anyone can write a comment on a weblog, so it has a similar collaborative feel to a Wiki. The problem is, currently we don’t have an easy way to track Microcontent Wikis. We can subscribe to RSS feeds for weblogs and even topics (k-collector), but weblog comments aren’t as simple to aggregate.”

I agree…that is why there is a comments RSS feed for Emergic.org.

But what Richard is talking about is slightly different: “I’d like to be able to track comments on other people’s sites, but post-by-post only. In other words I’d like to de-couple bits of content from their various locations – particularly if they’re buried in a weblog comments system – and collect them together in my RSS Aggregator.”

Richard summarises this as: Weblog Post + Comments = Microcontent Wiki. “Content is always going to be tightly coupled to location. This is especially so in a weblog, where the location will be a URL. But even in a Wiki, or a Microcontent Wiki as I’ve described it (weblog post + comments), there is a central location where content on a specific topic is aggregated. The key is to make it easy to subscribe to all the “locations” that interest you. Currently it’s easy to subscribe to weblogs using RSS. Now we want to make it easy to subscribe to microcontent.”

Hmm…there isn’t an easy solution to this! But it sure would be good to have. Many comments are left by people who have viewpoints but no blog of their own, or who want to join an ongoing discussion on the original blog.
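If per-post comment feeds did exist, pulling them into one stream would be the easy part. A sketch using the third-party feedparser library; the per-post feed URLs are hypothetical, which is exactly the gap Richard is pointing at:

```python
import feedparser  # third-party: pip install feedparser

# Hypothetical per-post comment feeds, one per "Microcontent Wiki".
comment_feeds = [
    "http://example.com/2003/09/some-post/comments.xml",
    "http://weblog.example.org/entry/42/comments.xml",
]

comments = []
for url in comment_feeds:
    feed = feedparser.parse(url)
    comments.extend(feed.entries)

# Merge every comment into one chronological stream for the aggregator.
comments.sort(key=lambda e: e.get("published_parsed") or (), reverse=True)
for entry in comments:
    print(entry.get("title", "(untitled)"), "-", entry.get("link", ""))
```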

This is yet another example of the proliferation of microcontent, as is noted by Don Park, who beautifully links blogs and wikis:

Imagine posts and comments flowing from blogs to wikis like the way streams feed into lakes. Got the picture yet? Now think of a blog category as a wiki page. The picture changes so that the blog becomes a mountain and categories become the streams running down the side of the mountain in all directions toward wikis into which streams from other mountains also feed.

Here are some decorations to complete the above picture:
– rain is the news that bombards us daily
– rocks that form the mountains are our experiences
– volcanic eruptions are our rants
– flash floods are sudden spikes of activity
– clouds are news generators like North Korea or Saddam Hussein

The RSS revolution has just begun.

TECH TALK: The Death and Rebirth of Email: Solution Ideas (Part 5)

At the core of the Internet email ecosystem is SMTP. The growth in spamming in recent times has brought the limitations of the protocol into sharp focus. News.com recently wrote about an ongoing discussion on replacing SMTP:

Developed when the Internet was used almost exclusively by academics, the Simple Mail Transfer Protocol, or SMTP, assumes that you are who you say you are. SMTP makes that assumption because it doesn’t suspect that you’re sending a Trojan horse virus, posing as a relative of a deposed African dictator to make fraudulent pleas for money, or hijacking somebody else’s computer to send tens of millions of ads for herbal Viagra… At issue is the protocol’s lack of a comprehensive way of verifying an e-mail sender’s identity. This makes it easy for people to mask their identities by forging return addresses and taking over victim machines to conduct their activities.

Some say rewriting SMTP from the ground up would be prohibitively difficult because of the protocol’s global user base, which is estimated to be in the hundreds of millions. “The difficulty of changing the transfer technology as a way of managing unsolicited bulk e-mail is the installed base,” said Rodney Tillotson, the chair of the Anti-Spam Working Group for the Reseaux IP Europeens (RIPE), a consortium of European Internet service providers.

“There are thousands or millions of SMTP servers transferring and delivering mail, and getting them all changed will take years, during which time the (unsolicited bulk e-mail) problem probably remains unsolved,” Tillotson said. “Proposals requiring a change to desktop mail software are even harder to deploy.”

Sluizer counters this by suggesting two protocols–SMTP and a new one, with tighter authentication–could easily coexist, with e-mail applications supporting both side by side. In that way, people using one protocol would not be prevented from exchanging mail with those using another.
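The blind trust the article describes is easy to see from a mail client’s side: in SMTP, the envelope sender and the From header are simply whatever the sender asserts. A sketch using Python’s standard smtplib (the relay hostname and addresses are made up; modern servers layer checks such as SPF and DKIM on top, but the protocol itself imposes none):

```python
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
# SMTP assumes that you are who you say you are: nothing below is verified.
msg["From"] = "anyone@forged.example"   # an arbitrary, unchecked claim
msg["To"] = "victim@example.com"
msg["Subject"] = "SMTP takes this on faith"
msg.set_content("The From address above was never authenticated.")

# Hypothetical relay; the envelope sender is asserted just as freely.
with smtplib.SMTP("relay.example.com") as server:
    server.send_message(msg, from_addr="anyone@forged.example")
```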

Eric Rescorla does not see the need to change SMTP:

One of the many ideas proffered for stopping spam is to require positive authentication for every email message sent. This would have a number of benefits:

1. Make forgery much more difficult, thus making it easier to track down spammers.
2. Allow the maintenance of black lists of known spammers.
3. Allow readers to use white lists of people known to send legitimate mail. (This doesn’t work currently because email addresses are often forged).

The bad news, however, is that authentication probably won’t work, for three very important reasons:

1. In this context, authentication probably means cryptographic authentication. This means issuing certificates (or something like them) to every mail sender in the network. This is another case of assuming you have a can opener.
2. Since spam zombies are now taking control of legitimate users’ machines, they will be able to send mail as those users, which will make any authentication system pretty much irrelevant.
3. Because of relaying, it’s very difficult to figure out whether any given machine is a legitimate sender of mail for a given address. For instance, how do you know that foo.isp.com is a legitimate source of mail from ekr@rescorla.com?

Due to these difficulties, full email authentication is probably a non-starter. It gets brought up pretty much every time stopping spam is mentioned, but no one really knows how to deploy it.

What does this mean for SMTP?
You may have noticed that of the three reasons I just gave, only one (relaying) has anything to do with SMTP and even that is a topological feature of the Internet rather than SMTP proper. We could perfectly well ban or restrict relaying and SMTP would continue to serve perfectly well–provided that we got the topology right so that mail could still be delivered, but that’s not an SMTP issue either.

The truth of the matter is that not only is it trivial to retrofit SMTP to add server authentication, it’s already been done. RFC 3207 describes how to use TLS with SMTP. The only problem is that due to points (1) and (3) above, no one has appropriate certificates to authenticate with, and so it’s mostly used for confidentiality (data secrecy), not authentication. But remember that that’s not a problem with SMTP either.

As far as I know (and I’m pretty close to this) there’s basically nothing that people have proposed as an anti-spam measure that SMTP can’t easily be modified to do. The movement to ditch SMTP strikes me as more of a howl of frustration at our collective inability to deal with spam than an actual reasoned argument for change.
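The retrofit Rescorla mentions is visible in everyday client libraries. A sketch with Python’s standard smtplib against a hypothetical server: STARTTLS upgrades the existing SMTP session to an encrypted channel, but, as he notes, without meaningful certificates on the sender’s side this buys confidentiality rather than authentication:

```python
import smtplib

# Hypothetical submission server; 587 is the conventional submission port.
with smtplib.SMTP("mail.example.com", 587) as server:
    server.ehlo()
    server.starttls()  # same protocol, now running over TLS
    server.ehlo()      # re-greet over the encrypted channel
    # The server presented a certificate, but nothing here proves who is
    # sending mail; that is Rescorla's point about confidentiality
    # versus authentication.
    server.noop()
```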

Tomorrow: Solution Ideas (continued)
