Emergic Grid Team

Netcore is creating a utility computing platform to enable affordable and manageable computing, as part of our vision for tomorrow’s world. We believe that this platform will be the way computing will be made available to the next billion users.

We are growing our Emergic Grid development team which is building this centralised computing platform. We need people with a strong computer science academic background. Industry experience is a must. Positions are available at any of our four offices in Mumbai, Pune, Bangalore and Chennai.

We are working on research and development of cluster and grid products, along with the associated high-availability and manageability infrastructure tools. Our team is responsible for the research, design and development of state-of-the-art high-availability and manageability infrastructure that makes applications easy to deploy and diagnose, provides continuous availability, and is easy to use. We work on challenging problems in the areas of distributed services, high availability, configuration, grid management, workload management, monitoring, and single system image support.

Tech Leads (3)

A technical lead who works well in a team, can define new projects, provide direction and mentor others. BS or MS degree and significant software design and development experience in one or more of the following areas: operating systems, cluster and distributed systems, distributed file systems, storage systems, the Linux kernel, and high-availability systems. A minimum of seven years of software engineering or related experience is required, along with system-level programming skills in C and C++. Must have good communication and teamwork skills and be fluent in English, with a proven track record of defining, building and shipping products in a timely manner. A good understanding of grid computing, knowledge of existing clustering and high-availability products, and networking experience are required.

Project / Product Manager (1)

Job Description:

  • Person must be able to handle all three areas described below.
  • Release management: responsible for creating and managing software release processes.
  • Project management: responsible for managing timelines and coordinating across different functional areas.
  • Product management: competitive analysis of the product; creating requirement specifications; tracking industry trends in similar products; coming up with new product ideas; interacting with business partners.

    Requirements: BS or MS degree or equivalent experience relevant to the functional area. A minimum of four years of software engineering or related experience is required, along with previous experience as a product manager or program manager. Must have excellent communication and coordination skills.

    If you are interested, please email me or fill out the feedback form on the blog.

    Computing for Broadband 101

    An article I wrote for Business Today (February 13 issue, page 18):

    The year 2005 may well be the year of broadband in India. Or so it seems. With BSNL and MTNL launching their services across India, other players too are getting ready for a price war. But before the much-awaited broadband boom happens, we have two important hurdles to cross if we are to replicate the heady growth of mobile phones in India over the past five years: the cost and complexity of computers.

    Ten years after VSNL launched commercial Internet operations in India, broadband promises a major upgrade to the quality of services that are available. Broadband is all set to usher in a variety of services across entertainment, e-governance, telemedicine, education and software applications. For us to realise the true potential of broadband connections, we will need to first rapidly ramp up the installed base of computers in India from the present 14 million.

    Computers consist of hardware and software (the applications). Even as the hardware becomes more affordable (a low-end computer costs about Rs 15,000), software costs have risen as a percentage of the total outlay. So far, many in India have turned to piracy or non-consumption as solutions. Neither is good enough to boost usage or to build the software and content developer ecosystem that increases the value of the computer by making more services available to end users.

    Besides the affordability of the full solution, the other issue that needs to be tackled is manageability. Viruses and spyware have made life difficult for less savvy users. Backing up data from one's desktop does not come naturally either. Support, especially for home users, is not easy to get from the vendors.

    If broadband is to boom in India, the computing industry will need two innovations, reinventing both its architecture and its business model. After all, what will people do with fat pipes without affordable and manageable access devices and a variety of services to access?

    To reinvent the computing architecture, we need to take a leaf out of the industry's past in centralised computing and create zero-management access devices. Think of these as thin clients. To build these multimedia-enabled network computers, move the guts of today's personal computer (the high-end processor and the storage) to the server, and replace them with the innards of a mobile phone (a low-cost processor and limited memory). The thick server delivers virtual desktops to users over broadband connections.

    This server-centric computing model has many advantages. First, the access device can be dramatically simplified, with the potential to reduce its cost to about Rs 3,000 (keyboard, mouse and monitor would cost an additional Rs 4,000). Second, the computers require no maintenance and can be easily bundled with the connectivity, without the worry of house (or office) calls for support. Third, piracy is eliminated, since all software and content is delivered via the server and can be controlled and monitored by the service provider.

    The second innovation needed is on the business model. Instead of asking users to make upfront investments, computing needs to become a utility available on a subscription basis for monthly payments. The pay-as-you-go model is what the world of mobile phone users and cable TV watchers is already very familiar with. This reduces the entry barrier dramatically for new users and provides a full solution at an affordable price.

    Using thin clients and server-centric computing, it should be possible for service providers to offer a bundle including broadband connectivity and support for no more than Rs 700 per user per month, which is about what most mobile phone users in urban India pay. This is the point where computing will take off and spur the creation of a wide variety of services, making broadband a catalyst of transformation across homes, offices and educational institutions.
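    The economics above are simple arithmetic, and a back-of-the-envelope sketch makes the entry-barrier argument concrete. The figures below are the article's own estimates; the function names and the comparison itself are illustrative, not a pricing model:

    ```python
    # Back-of-the-envelope comparison of upfront PC ownership vs. the
    # subscription-based thin-client bundle described in the article.
    # All rupee figures are the article's estimates, not vendor pricing.

    UPFRONT_PC = 15_000     # Rs: low-end personal computer, paid upfront
    THIN_CLIENT = 3_000     # Rs: simplified access device
    PERIPHERALS = 4_000     # Rs: keyboard, mouse and monitor
    MONTHLY_BUNDLE = 700    # Rs/month: connectivity + computing + support

    def entry_cost_subscription() -> int:
        """Upfront outlay for a subscriber: access device plus peripherals."""
        return THIN_CLIENT + PERIPHERALS

    def months_to_match_pc_cost() -> float:
        """Months of subscription before cumulative payments reach the
        upfront price of a low-end PC (ignoring the PC's own connectivity
        and support costs, which would favour the subscription further)."""
        return UPFRONT_PC / MONTHLY_BUNDLE

    if __name__ == "__main__":
        print(f"Subscriber entry cost: Rs {entry_cost_subscription():,}")
        print(f"Months of fees to equal a PC's price: {months_to_match_pc_cost():.1f}")
    ```

    The point of the sketch is that the subscriber's upfront commitment drops from Rs 15,000 to Rs 7,000, with nearly two years of usage before cumulative fees reach the PC's sticker price, by which time connectivity and support are already included.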

    The next platform will consist of network computers as zero-management access devices, ubiquitous broadband networks, server-based computing and storage grids as the underlying infrastructure, centrally accessible services built around hosted software and content, and a utility-like, subscription-based payment model. This is what will take the power of computing to the next billion users globally.

    This utility computing framework also provides the building blocks for a unified digital infrastructure capable of supporting computing, communications and entertainment, and facilitating the creation of next-generation utilities. Just as previous utilities brought transportation, water, electricity and telecom to the masses and transformed their lives, this utility can unlock the hidden potential of today's forgotten masses, not just in India but also in other emerging markets.

    India’s Rural Employment Guarantee Scheme

    The Economist writes:

    The appeal of poor-law relief, according to John Stuart Mill, was that “it is available to everybody [but] it leaves to every one a strong motive to do without it if he can.” The state does not test a person’s means, but by offering low wages in return for hard work it deters those who can support themselves. In theory, wages should be close to, but slightly below, the market wage. They should, as Mill said, give “the greatest amount of needful help, with the smallest encouragement to undue reliance on it.”

    The real value of the public works lies not in the orchards they plant, but in the safety net they provide. The rural poor, with little access to credit or insurance, have few opportunities to smooth their consumption in the face of misfortune. As a result, they tend to be conservative in their farming decisions, eschewing productive ventures that might raise the expected value of their income, but also raise its variance. Despite the clear risk of corruption by the officials administering it, an employment guarantee could give the poor something to fall back on. This might encourage farmers to take productive risks, such as experimenting with high-yielding seed varieties. It might spare them from distress sales of draught animals, keep them from pulling their children out of school and stop them from migrating to city slums.

    Employment guarantees pose a policy trilemma. In principle, the Indian government might like to achieve three goals: an unconditional guarantee of employment, at the minimum wage, without busting the budget. In practice, it can achieve any two of those three. If it offers above-market wages, it can contain the fiscal cost only by diluting the guarantee.

    The Firefox Explosion

    Wired has a cover story on the open-source browser:

    What makes Firefox different from other open source projects is its consumer appeal. Until now, the open source community has been very good at creating useful software but lousy at finding nontechnical users. By liberating Firefox from the “by geeks, for geeks” ethos, Ross and Goodger have moved open source out of server rooms and onto Microsoft’s turf: the desktop. Borrowing from the Net-based grassroots techniques of the recent political season, the Firefox inner circle has turned satisfied users into foot soldiers and missionaries. How’s this for a marketer’s dream: In the weeks following the debut, Firefox contributors and fans threw their own launch parties in 392 cities around the world.

    “People thought the browser wars were over,” Ross says, relishing the giant-killer role. “But now there’s a widespread perception that IE is not secure – and here we are.” What started out as one schoolboy’s exercise in minimalism, with a nod to Google’s back-to-basics obsession, has tapped into a growing desire for simplicity among ordinary computer users. “The success of this thing has totally surprised us,” Goodger adds. “Firefox has really touched a nerve.”

    What’s Hot in 2005

    Indexed Forever provides a view from some VCs:

    Last night I attended the Churchill Club's Venture Capital: What's Hot? What's Not? panel. On the panel were Jim Breyer of Accel Partners, Bill Gurley of Benchmark Capital, and Joe Lacob of Kleiner Perkins Caufield & Byers; moderating was yet another venture capitalist, Geoffrey Yang of Redpoint Ventures.

    After a superb comedic introduction by Yang, the panel settled into mostly violent agreement on the topics.

    Hot: Digital Home, particularly component plays. Software with open source components or delivered as a managed service. Later-stage deals in China.

    Not: Packaged enterprise software. Storage. Semiconductors. Nanotech (for a decade).

    Breyer restated his belief in content deals and peer-to-peer networks, as he had at last year's event. Oddly, he suggested that distribution through retailers and PC OEMs was promising: it's 1994 all over again.

    Gurley pushed his belief in the future of multiplayer gaming, mobile devices and security.

    Yang closed with a few thoughts. We are entering a bubble in internet media companies that have search or local in their business plans. We are entering a bubble in investing in China.

    The $100 PC

    The New York Times writes:

    Nicholas Negroponte, the technology guru from the Massachusetts Institute of Technology Media Laboratory, prowled the halls of the World Economic Forum holding the holy grail for crossing the digital divide: a mock-up of a $100 laptop computer.

    The machine is intriguing because Mr. Negroponte has struck upon a remarkably simple solution for lowering the price of the most costly part of a laptop – the display – to $25 or less.

    Mr. Negroponte said that he had found initial backing for his laptop plan from Advanced Micro Devices and said that he was in discussions with Google, Motorola, the News Corporation and Samsung for support.

    The device includes a tentlike pop-up display that will use the technology now used in today’s rear-projection televisions, in conjunction with an L.E.D. light source.

    Mr. Negroponte said his experience in giving children laptop computers in rural Cambodia had convinced him that low-cost machines would make a fundamental difference when broadly deployed.

    Red Herring writes:

    The low-cost computer will have a 14-inch color screen, AMD chips, and will run Linux software, Mr. Negroponte said during an interview Friday with Red Herring at the World Economic Forum in Davos, Switzerland. AMD is separately working on a cheap desktop computer for emerging markets. It will be sold to governments for wide distribution.

    Mr. Negroponte and his supporters are planning to create a company that would manufacture and market the new portable PCs, with MIT as one of the stakeholders. It is unclear precisely what role the other four companies will play, although Mr. Negroponte hopes News Corp. will help with satellite capacity.

    An engineering prototype is nearly ready, with alpha units expected by year's end and real production around 18 months from now, he said. The portable PCs will be shipped directly to education ministries, with China first on the list. Only orders of 1 million or more units will be accepted.

    Om Malik adds: “I think this will be [a] subsidized product, because even the back of the envelope calculations show that this cannot be built for $100.”

    TECH TALK: Microsoft, Bandwidth and Centralised Computing: The Arguments For Centralised Computing

    The central issue is not whether Microsoft should fear bandwidth. It is about centralised computing and whether it will take root or not. One point that needs to be kept in context is that most of the commentators on Mike's post are from the developed markets. My belief is that the computing world will see a schism, with centralised computing (enabled by bandwidth) increasingly taking root amongst the next set of users and providing a solution that is both affordable and manageable. In the developed markets, there will be a trend towards greater remote management of desktops, driven more by manageability than by affordability. More on this soon. First, let us consider the arguments for centralised computing.

    The primary reason why the discussion on centralised computing and remote applications is relevant is manageability. Today's desktops have become increasingly hard to administer, with (home) users being lax in installing adequate protection against viruses and spyware, and forgetting to take backups. Enterprise users have less of a problem because of better protection at the perimeter in the form of well-configured firewalls. With centralised computing, the onus of management would shift away from users to service providers, who would presumably automate much of the basic management of the desktop systems. Thus, the focus for centralised computing is not so much on the savvy 10-20% of users, but on the other 80%.

    The other advantage of centralised computing would be in eliminating piracy. By (possibly) serving and managing applications centrally, users would be obliged to pay for the software, applications and content that they use. This would ensure that money flows back to the creators and could be especially useful for software vendors that serve emerging markets where piracy is rampant.

    Centralised computing could also ensure that users have access to their data and applications from anywhere. Instead of a personal computer, the focus shifts to personal data. This is especially useful in a world of multiple devices. Whether it is a browser on a computer or a virtual desktop on a thin client or a mobile phone, users would have access to their data anytime and from anywhere.

    What enables centralised computing is the availability of open-source applications. These allow service providers to aggregate a complete stack at a much lower price than closed-source, higher-priced alternatives. In addition, centralising applications also simplifies providing updates and patch management.

    What centralised computing does is enable the notion of the computer as an information appliance in the form of a thin client. It can also form the base for newer business models like those seen in telecom, with monthly user fees based on usage. As we shall see later, centralised computing has the potential to make computing a utility for the next billion users.

    Tomorrow: The Arguments Against Centralised Computing
