A study claims that Internet performance could start to decline by 2010 due to a growing gap between access capacity and demand.
Nemertes Research estimates that up to US$55 billion needs to be spent to close that gap, about 60 to 70 per cent more than service providers intend to spend.
"The primary impact of the lack of investment will be to throttle innovation both the technical innovation that leads to increasingly newer and better applications, and the business innovation that relies on those technical innovations and applications to generate value," said the report released Tuesday.
"The next Google, YouTube, or Amazon might not arise, not because of a lack of demand, but due to an inability to fulfill that demand. Rather like osteoporosis, the underinvestment in infrastructure will painlessly and invisibly leach competitiveness out of the economy."
A University of Toronto computer science professor told CTV.ca that while he didn't analyze the report's details, he agrees with its general thrust about a looming Internet slowdown.
"This is an inevitability, whether it's 2010 or 2012," said Eugene Fiume.
"This was predictable in the 1980s," he said.
The exploding use of the Internet in emerging economies like China and India will create "hotspots" within the distributed network that is the Internet, he said.
Data will slow down in these hotspots, much like how traffic slows at a poorly designed city intersection. "You will eventually see the not-so-graceful degradation of the entire system," Fiume said.
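Why would degradation be "not-so-graceful" rather than gradual? In a standard queueing model, delay at a congested link grows in proportion to 1/(capacity - demand): response times climb slowly at moderate load, then shoot up as the link nears saturation. The short Python sketch below illustrates that general principle; it is a simplified, hypothetical model (a single M/M/1 queue with made-up numbers), not a calculation from the Nemertes report.

```python
# Illustrative sketch only -- not from the Nemertes study. Models one
# congested link as an M/M/1 queue, where average time in the system is
# W = 1 / (capacity - demand), valid only while demand < capacity.

def mm1_delay(capacity: float, demand: float) -> float:
    """Average relative delay at a link (arbitrary units of work and time)."""
    if demand >= capacity:
        raise ValueError("demand at or above capacity: queue grows without bound")
    return 1.0 / (capacity - demand)

# A link with 100 units of capacity under rising demand:
for demand in (50, 80, 90, 95, 99):
    print(f"demand {demand:>3}/100 -> relative delay {mm1_delay(100, demand):.2f}")

# Delay rises 2.5-fold between 50% and 80% load, but ten-fold between
# 90% and 99% -- a small capacity shortfall produces an outsized slowdown,
# which is the "hotspot" effect Fiume describes.
```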
Technology analyst Kris Abel told CTV.ca that the study may rest on too many assumptions: it is difficult to predict the future, and new technologies could offset some of the concerns raised in the study, concerns that have been widely known for some time.
"They look specifically at just wired services," he said.
"We're living in an age now where increasingly we are getting a lot of our internet service through wireless solutions, and in wireless solutions, you don't have the same problems."
Backbone vs. the last mile
Nemertes said it analyzed consumer demand and capacity independently.
Some say the problem isn't with the core backbone of the Internet, but with the connections that deliver access to consumers -- what the telecommunications sector calls "the last mile."
Fiume said that's partially true, but added, "that's what telcos want you to believe, because that's pushing the problem onto the consumer."
Internet service providers can already regulate bandwidth hogs, such as heavy users of peer-to-peer file-sharing networks, he said.
The Internet's overall pipes need to be widened, and the rules by which data "packets" are transmitted need to be made more efficient, he said.
"The telcos didn't plan well enough to deal with the explosion of information content on the Internet writ large," Fiume said, adding, "they trying to make it seem like the fact you're watching YouTube is really causing the problem. That's really very laughable."
Nemertes, which didn't make a spokesperson available to CTV.ca, said no single group funded the study; the funding came from its client base.
The data came from several sources:
- Research data collected by academic organizations
- Publicly available documents, including vendor and service provider financials
- Confidential interviews with enterprise organizations, equipment vendors, service providers, and investment companies
"During the course of this project we spoke with 70-plus individuals and organizations for these interviews, and we relied on our base of several hundred IT executives who participate in our enterprise benchmarks," the report said.
However, the group said the Internet remains an exceedingly opaque environment.
"Content providers refuse to reveal their inner workings. This is often for very good reasons, but it's detrimental to the industry," it said.
The group called for industry to develop ways of better sharing data with researchers.