Future internet aims to sever links with servers

via University of Cambridge

A revolutionary new architecture aims to make the internet more “social” by eliminating the need to connect to servers and enabling all content to be shared more efficiently.


Researchers have taken the first step towards a radical new architecture for the internet, which they claim will transform the way in which information is shared online, and make it faster and safer to use.

The prototype, developed as part of an EU-funded project called “Pursuit”, is being put forward as a proof-of-concept model for overhauling the existing structure of the internet’s IP layer, through which isolated networks are connected, or “internetworked”.

The Pursuit Internet would, according to its creators, enable a more socially-minded and intelligent system, in which users would be able to obtain information without needing direct access to the servers where content is initially stored.

Instead, individual computers would be able to copy and republish content on receipt, providing other users with the option to access data, or fragments of data, from a wide range of locations rather than the source itself. Essentially, the model would enable all online content to be shared in a manner emulating the “peer-to-peer” approach taken by some file-sharing sites, but on an unprecedented, internet-wide scale.
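The republish-on-receipt idea described above can be illustrated with a toy sketch. This is not the Pursuit protocol itself; all class and function names here are invented for illustration. The point is that once any node has a copy, it can satisfy requests for that content by identifier, so the original source no longer needs to be reachable:

```python
import hashlib

class Node:
    """Toy node: caches content on receipt and can re-serve it by identifier."""
    def __init__(self):
        self.store = {}

    def receive(self, content: bytes) -> str:
        # Identify content by what it is (its hash), not where it lives.
        cid = hashlib.sha256(content).hexdigest()
        self.store[cid] = content  # "republish": this node can now serve it
        return cid

    def fetch(self, cid: str):
        return self.store.get(cid)

def resolve(cid: str, nodes):
    """Return the content from the first node holding it; no origin server needed."""
    for node in nodes:
        content = node.fetch(cid)
        if content is not None:
            return content
    return None

origin, peer = Node(), Node()
data = b"some shared document"
cid = origin.receive(data)
peer.receive(data)  # the peer copies and republishes the content
# Even with the origin offline, the peer alone satisfies the request:
assert resolve(cid, [peer]) == data
```

In a real information-centric network the resolution step is a distributed rendezvous system rather than a simple list scan, but the principle is the same: requests name the data, and any holder of a copy may answer.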

That would potentially make the internet faster, more efficient, and more capable of withstanding rapidly escalating levels of global user demand. It would also make information delivery almost immune to server crashes, and significantly enhance the ability of users to control access to their private information online.

While this would lead to an even wider dispersal of online materials than we experience now, the researchers behind the project argue that by focusing on the information itself rather than the web addresses (URLs) where it is stored, digital content would become more secure. They envisage that by making individual pieces of data recognisable, each piece could be “fingerprinted” to show that it comes from an authorised source.
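One common way to make data recognisable independently of its location is to derive its identifier from the content itself, so any copy can be checked against the name used to request it. The sketch below uses a plain SHA-256 hash for this; it is an assumption for illustration, not the specific fingerprinting scheme Pursuit uses (which would also involve cryptographic signatures to bind content to an authorised publisher):

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """Derive a location-independent identifier from the content itself."""
    return hashlib.sha256(content).hexdigest()

def verify(content: bytes, expected_id: str) -> bool:
    """Any copy, fetched from any node, can be checked against its identifier."""
    return fingerprint(content) == expected_id

article = b"Future internet aims to sever links with servers"
cid = fingerprint(article)
assert verify(article, cid)          # an intact copy checks out
assert not verify(b"tampered", cid)  # a modified copy is rejected
```

Because the identifier is bound to the bytes rather than to a server, a tampered or substituted copy fails verification no matter which node delivered it.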

Dr Dirk Trossen, a senior researcher at the University of Cambridge Computer Laboratory and the technical manager for Pursuit, said: “The current internet architecture is based on the idea that one computer calls another, with packets of information moving between them, from end to end. As users, however, we aren’t interested in the storage location or connecting the endpoints. What we want is the stuff that lives there.”


“Our system focuses on the way in which society itself uses the internet to get hold of that content. It puts information first. One colleague asked me how, using this architecture, you would get to the server. The answer is: you don’t. The only reason we care about web addresses and servers now is because the people who designed the network tell us that we need to. What we are really after is content and information.”

Read more . . .
