Yes, according to McKinsey Quarterly (via News.com). It writes:
A total of $1.25 trillion was spent on new information technology systems from 1995 to 1999. Approximately $350 billion of this sum involved responses to extraordinary events, including Y2K investments, the growing penetration of personal computers in consumer and business markets (a phenomenon driven by a desire to access the Internet), and the creation of corporate networking infrastructures.
Moreover, the growing memory and speed requirements of new application software and of Microsoft’s Windows operating systems raised the frequency of computer upgrades.
But the tide has since turned. Over the next three to five years, fewer software upgrades and the near saturation of the consumer and, especially, the business markets could push growth below what it was before 1995. The inevitable result is declining productivity growth rates in computer manufacturing. Productivity growth will also slow in the semiconductor industry, though not as much, because of sustained international demand for microprocessors and other chips and continued performance improvements.
I don't disagree with the facts above, but the conclusion is not correct. What is over is the party for companies selling new and expensive hardware and software. The world still has a huge population just getting exposed to computing, but these users can only pay a tenth of what users in the developed markets have paid. They need innovative computing solutions, but at their price points and for their needs.