Supply Driven Networks

IP.com Disclosure Number: IPCOM000123999D
Original Publication Date: 1999-Sep-01
Included in the Prior Art Database: 2005-Apr-05
Document File: 2 page(s) / 85K

Publishing Venue

IBM

Related People

Doyle, R: AUTHOR

Abstract

The current method of information dissemination on the Internet is based on the request of a consumer. When a user wishes to view information from a web site, a request is sent on the Internet to access the data. Caches are used throughout the Internet to reduce the latency of subsequent requests, but these caches are populated by the first request for the data.

This text was extracted from an ASCII text file.
This is the abbreviated version, containing approximately 52% of the total text.

Supply Driven Networks

The current method of information dissemination on the
Internet is based on the request of a consumer.  When a user wishes
to view information from a web site, a request is sent on the
Internet to access the data.  Caches are used throughout the Internet
to reduce the latency of subsequent requests, but these caches are
populated by the first request for the data.
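The request-driven ("pull") model described above can be sketched in a few lines. This is a minimal illustration, not part of the disclosure: the cache class, the origin callback, and all names are hypothetical, and real Internet caches add expiry and consistency logic omitted here.

```python
# Sketch of the pull model: a cache is populated only by the first
# request for a piece of data, so that first consumer always pays
# the full latency of fetching from the origin site.

class PullThroughCache:
    def __init__(self, origin_fetch):
        self._fetch = origin_fetch   # call out to the origin site (hypothetical)
        self._store = {}             # locally cached copies
        self.origin_hits = 0         # counts how often the origin was contacted

    def get(self, url):
        if url not in self._store:               # first request: cache miss
            self.origin_hits += 1
            self._store[url] = self._fetch(url)  # the request populates the cache
        return self._store[url]                  # later requests are served locally

origin = lambda url: f"content of {url}"
cache = PullThroughCache(origin)

cache.get("http://example.com/news")   # first request travels to the origin
cache.get("http://example.com/news")   # subsequent request served from the cache
```

Note that the second `get` never contacts the origin; only the first request crosses the network, which is exactly the latency tradeoff the disclosure sets out to invert.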

   Given the improvement in network speeds and memory sizes,
it can readily be predicted that technology in the next five to ten
years will allow for network backbone speeds of hundreds of gigabits
per second, with main memory capacities well into the gigabyte range.
Disk capacities will be in the terabyte range.  With this increase in
network speeds and caching capacities, it will be possible to support
a change in the information flow model of the Internet.

   The goal of this idea is to use the faster networks to
trade increased traffic for reduced latency.  The scarce resource of
the future will be neither memory capacity nor network bandwidth, but
human attention.  With the overabundance of information on the
Internet, users do not want to wait for information to be downloaded.

   Internet sites that produce information (CNN, ESPN, Wall
Street, etc.) will change from a periodic update of replicated sites
to a supply driven model that sends out new content from these sites
as soon as it is available.  The information is sent (on supply) via
IP multicast to replication sites and subscribed ISPs, so these
repositories always have the latest information available from the
data sources.  Requests based on user queries are no longer sent
across the Internet.  The majority of traffic on the backbone of the
Internet would be data flowing from sources to repositories.  ISP and
replication sites are always certain that they have the latest
information and return it to the user.  Us...
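The supply driven ("push") model can be sketched as follows. This is a minimal in-process illustration under assumed names: a real deployment would push the content over IP multicast as the disclosure describes, while here the source simply delivers each new item to every subscribed repository the moment it is published, so repositories always hold the latest copy and answer user queries locally.

```python
# Sketch of the push model: a content source delivers new items to
# all subscribed repositories (replication sites, ISPs) on supply,
# so no user-driven request ever crosses the backbone.

class Source:
    def __init__(self, name):
        self.name = name
        self.subscribers = []          # replication sites and subscribed ISPs

    def subscribe(self, repository):
        self.subscribers.append(repository)

    def publish(self, item, content):
        # "On supply": push to every repository as soon as content exists.
        for repo in self.subscribers:
            repo.receive(self.name, item, content)

class Repository:
    def __init__(self):
        self.store = {}                # always holds the latest pushed copy

    def receive(self, source, item, content):
        self.store[(source, item)] = content

    def serve(self, source, item):
        # User queries are answered locally; nothing is fetched upstream.
        return self.store.get((source, item))

cnn = Source("CNN")
isp = Repository()
cnn.subscribe(isp)
cnn.publish("headline", "Markets rally")   # pushed before any user asks
```

When a user now requests the headline from `isp`, it is served from local storage with no upstream request, which is the latency-for-traffic tradeoff the text describes.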