Scandoil  

The Sequel

Published Dec 12, 2003

The Internet has become a global marketplace, an information tool, and a community. But some people think it should be more than it is today – more immersive, more flexible, more everything. Cue the sequel: Internet 2.

Map of Internet 2 International partners

It’s been twelve years since Tim Berners-Lee created the protocol that governs the World Wide Web. He intended it as an information network for physicists worldwide, but it soon became evident that his creation had much wider application. The Web was made universally available, and the computer network known as the Internet started spreading as though it were spun by a very industrious spider.

In 1993 the first graphical browser was released. Marc Andreessen, who later co-founded Netscape Communications, called it Mosaic. The first graphical web interface was like a grainy silent movie compared to the special effects and sleek, streamlined graphics that characterize the Net in 2003. But some people see the Internet as too limited, unable to handle the demands of the future.

Second Coming
Internet 2 is a consortium led by 202 universities, working in partnership with industry and government to develop a network that can fulfil the needs of a world that will be constantly online, transferring huge amounts of data at enormous speeds. The Internet we know was created for a dial-up regime – analog phone lines with very limited bitrates, connected to the network only for limited periods at a time.

As broadband access proliferates across the world, bringing higher transfer speeds and larger volumes of data, it is clear that the very infrastructure of the Internet has to be reworked. This does not mean that a separate Internet will appear alongside the one we already have. Internet 2 aims to deploy new and improved network technologies on the existing Net. As the consortium states in its FAQ: “Just as email and the World Wide Web are legacies of earlier investments in academic and federal research networks, the legacy of Internet2 will be to expand the possibilities of the broader Internet”.

Based on a partnership between industry and academia, Internet 2 has several aims. One is to enable revolutionary new applications that cannot be accommodated on today’s Net. This includes new ways of organizing large amounts of information, such as digital libraries, with indexing and search capabilities not possible today. Another is the development of applications that support distance-independent learning and information-sharing in new and efficient ways. According to the consortium, the medical professionals of the future will be able to perform remote diagnoses based on videoconferencing and sensory translation applications. Scientists will be able to work in virtual laboratories. But all of this is dependent on speed.

Need for Speed
We are used to waiting: the hourglass icon and the slowly creeping progress indicator are familiar images. We expect choppy video streams, even though we find them tremendously irritating. But speed is not just a matter of customer satisfaction – it is a prerequisite for many of the services that will populate the Net and change communications in the years to come.

“The Internet 2 backbone in the United States moves billions of bits of data per second, 300,000 times faster than the connection we have at home,” Ted Hanss, Director of Internet 2, told CNN recently. Not long ago, the Internet 2 network hosted the fastest data transmission in history, when researchers transferred 6.7 gigabytes of data – roughly the equivalent of two full-length DVD movies – across 10,978 kilometres, from Sunnyvale, USA to Amsterdam in Holland, in less than one minute. That averages about 923 megabits per second.
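As a rough sanity check on that figure, here is a minimal Python sketch; the 58-second duration is an assumption (the article only says the transfer took less than one minute):

```python
# Back-of-the-envelope check of the record transfer quoted above.
# The 58-second duration is an assumption; the article only says
# the transfer took "less than one minute".
data_gigabytes = 6.7                       # data transferred (decimal gigabytes)
duration_seconds = 58                      # assumed duration
bits_transferred = data_gigabytes * 8e9    # gigabytes -> bits
throughput_mbps = bits_transferred / duration_seconds / 1e6
print(f"average throughput: ~{throughput_mbps:.0f} Mbit/s")  # ~924 Mbit/s
```

A slightly longer or shorter run changes the result only marginally, which is why the quoted average lands at roughly 923 megabits per second.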

A very good home ADSL line today seldom boasts more than 2 megabits per second. Sound interesting? Then you’ll be happy to know that several different research teams – all part of the Internet 2 collaboration – are working to break that record every day. The competitive nature of the scientific community promises to push transfer speed to its limits.

This exponential increase in transfer rates is crucial to the applications that will make the Net a different experience in the future. Everything depends on mega-bandwidth – most of all the phenomenon known as tele-immersion.

Step Into The Screen
Tele-immersion proposes to be what virtual reality never became. The technology aims to enable users at geographically distributed sites to collaborate in real time in a shared, simulated environment as if they were in the same physical room. This is a very different paradigm for human-computer interaction from the one that governs how we use computers today. And it poses the greatest technical challenge for Internet 2.

This revolutionary technology will combine virtual reality and videoconferencing to create online simulated rooms. Sound familiar? Virtual reality was a big buzzword in the mid-nineties, but never lived up to its initial promise, largely due to a lack of supporting infrastructure. Now the Internet 2 consortium wants to bring it back, to create a fully immersive Internet environment.

The applications are many, and they promise great changes across society. Distance education becomes something else entirely in a virtual classroom, where students sit together as if in a real-world auditorium. The dissection of human bodies – essential to anatomy lessons for medical students – can be replaced by simulated dissections of virtual corpses. This has already been implemented at the University of Wisconsin–La Crosse as the Wisconsin Digital Cadaver.

The Backbone
Internet2 users enjoy true tele-presence – the ability to be simultaneously both here and there via crystal-clear video and digital stereo sound. The difference between surfing on the Internet 2 network and on a conventional dial-up connection is like the difference between riding a rusty flat-tired bicycle and flying a Concorde jet.

The reason Internet 2 users are able to fly the Concorde is the unique infrastructure that supports the network. Internet 2 is deployed largely over two high-performance backbone networks called Abilene and vBNS+. Data speeds down fiber-optic lines to universities, government agencies, and other places that are connected to one of 30 high-capacity networking centers, each known as a gigaPoP, short for gigabit point of presence. These giant network hubs can handle immense throughput of data.

In a windowless room at the Abilene Network Operations Center at Indiana University in Indianapolis, arrows representing data transmission pulse and glow in brilliant colors across a digital map of the United States, indicating Abilene's traffic flow. "We're talking about 2.4 gigabits per second through these circuits," Steve Peck, the Network Operations Center manager told Discover Magazine. An individual application on Internet2 can use up the entire 2.4 gigabits. That's around 43,000 times more data than can flow through a standard 56K dial-up Internet connection.
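The arithmetic behind that last comparison is simple to verify; a quick sketch:

```python
# Comparing a 2.4 Gbit/s Abilene circuit with a standard 56K dial-up modem.
abilene_bps = 2.4e9    # 2.4 gigabits per second
dialup_bps = 56e3      # 56 kilobits per second (nominal modem speed)
print(f"about {abilene_bps / dialup_bps:,.0f} times the capacity")  # about 42,857
```

Rounded, that is the “around 43,000 times” cited above.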

Challenges
Participating in the network costs more than half a million U.S. dollars, but the universities and other partners who are connected don’t see this as expensive. And so far, the Internet 2 project has escaped the funding drought that has plagued research efforts since the dotcom bubble burst into a fine mist. The main reason is that it is largely financed through academic and public funding. Another is that it is very hard to dispute the marketability of the applications developed through Internet 2. Some of them, such as high-definition video streams, could reach the market within the next three years.

The biggest challenge to bringing the next Internet to consumers is the hodgepodge mix of copper wires, coaxial cables, and fiber optics that holds the Internet together. Cleaning up the mess has been “more trouble than we envisioned,” one Internet 2 official concedes. The other main hindrance is the reluctance of telcos and content providers to launch high-end services in a volatile economic environment. Consumers won’t get to see the good stuff until network operators find new confidence in the Internet market. Let’s hope that confidence returns very soon.



