
System and method to accelerate the accessing of website

IP.com Disclosure Number: IPCOM000238177D
Publication Date: 2014-Aug-07
Document File: 3 page(s) / 60K

Publishing Venue

The IP.com Prior Art Database

Abstract

Have you ever had to use a browser plug-in to buy train tickets during the Spring Festival? Under such a flood of requests the server is easily crashed, web pages hang, and the tickets are sold out by the time you finally get a response. One reason is the static files such as CSS, JavaScript, and images: every client retrieves the same files from the server, so identical packages are transferred over the Internet again and again. In particular, when a huge number of requests is sent to the web server, a huge number of responses with identical content is sent back, and network bandwidth is wasted. Moreover, if the requests exceed the web server's workload capacity, many users have to keep waiting for responses, and some even lose their connections; in some conditions the web server crashes and stops its service. There are already several approaches to accelerate the loading of static files: 1. File compression: compress js/css/image files on the server side to reduce the requests from the client side. 2. Modern browsers use a cache/cache manifest on the client side to avoid re-requesting the same static files. 3. CDNs. But these technologies cannot address flood-like access: the communication between server and clients is still there, especially when a huge number of new users is trying to use the website. This disclosure focuses on accelerating web access so that a website can handle flood-access situations.


The core idea of this disclosure is to reduce the number of requests for static files from clients to the web server.

For static files, this disclosure cuts the direct server-to-client transfer and instead lets clients share their static files with each other, using geolocation to find the nearest peers and increase access speed.

For security purposes, the website provides a service that feeds the hash codes of the static files to clients; with these hash codes the browser can verify that files retrieved from other peers are trustworthy.
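Below is a minimal sketch of this verification step, written in TypeScript for a browser. It is an illustration only: the disclosure does not fix the hash algorithm or the peer transport, so the sketch assumes SHA-256 hex digests and that the peer serves the file over plain HTTP at a hypothetical URL peerUrl; a real deployment might use WebRTC data channels instead.

// Sketch: verify a static file fetched from a peer against the hash code
// published by the origin server (assumed here to be a SHA-256 hex digest).
async function fetchFromPeerAndVerify(
  peerUrl: string,          // assumed: the peer exposes the file over HTTP
  expectedHashHex: string   // hash of this file taken from D_File
): Promise<Blob> {
  const response = await fetch(peerUrl);
  const data = await response.arrayBuffer();

  // Hash the bytes received from the peer with the Web Crypto API.
  const digest = await crypto.subtle.digest("SHA-256", data);
  const actualHashHex = Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");

  // Accept the file only if it matches the hash published by the website.
  if (actualHashHex !== expectedHashHex) {
    throw new Error("Hash mismatch for " + peerUrl + "; fall back to the origin server");
  }
  return new Blob([data]);
}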

[On the Web Server side]

1. List all static files and calculate a hash code for each file.

2. Combine the static file names and their hash codes into a single file, and calculate the hash code of that combined file, which contains the information for all static files.

So there are two important artifacts (a sketch of generating both appears after this list):

I) The combined file that contains the hash codes of all static files (we name it D_File).

II) The hash code of the combined file itself (we name it App_Code).

3. The server provides a service that tells a client its nearest peers, determined by geolocation or by measuring network latency.
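A minimal server-side sketch of steps 1 and 2 follows, using Node.js and TypeScript. The directory ./public, the output names d_file.json and app_code.txt, and the choice of SHA-256 are assumptions for illustration; the disclosure does not prescribe them.

// Sketch: generate D_File (manifest of static file names and hash codes)
// and App_Code (hash of the manifest itself). Assumed layout: static files
// under ./public, outputs written to d_file.json and app_code.txt.
import { createHash } from "crypto";
import { readdirSync, readFileSync, writeFileSync, statSync } from "fs";
import { join } from "path";

function sha256(data: Buffer | string): string {
  return createHash("sha256").update(data).digest("hex");
}

// Step 1: list all static files (recursively) and hash each one.
function listStaticFiles(dir: string): string[] {
  return readdirSync(dir).flatMap((name) => {
    const full = join(dir, name);
    return statSync(full).isDirectory() ? listStaticFiles(full) : [full];
  });
}

const entries = listStaticFiles("./public").map((file) => ({
  file,
  hash: sha256(readFileSync(file)),
}));

// Step 2: combine the names and hashes into one manifest (D_File) ...
const dFile = JSON.stringify(entries, null, 2);
writeFileSync("d_file.json", dFile);

// ... and hash the manifest itself to obtain App_Code.
const appCode = sha256(dFile);
writeFileSync("app_code.txt", appCode);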

[On the Browser Client side]

1. When the website is accessed by its first user, that client retrieves the static files from the server as normal and, at the same time, downloads the combined file in which all the hash codes are stored (D_File).

2. When a subsequent client tries to access the same site,

I) It gets the nearest neighbors (peers) and the App_Code from the server; this can be comple...
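An end-to-end sketch of this client-side flow is given below, under the same assumptions as the earlier sketches. The endpoint paths /app_code, /peers, and /d_file.json, the use of localStorage as the D_File cache, and the helper fetchFromPeerAndVerify are hypothetical illustration names, not part of the disclosure.

// Sketch of the client-side flow: check App_Code, refresh the cached D_File
// only when it has changed, then load files from the nearest peers with hash
// verification, falling back to the origin server. Endpoint names and the
// localStorage cache are assumptions for illustration.
declare function fetchFromPeerAndVerify(
  url: string,
  expectedHashHex: string
): Promise<Blob>; // defined in the earlier sketch

interface DFileEntry {
  file: string;
  hash: string;
}

async function loadStaticFile(path: string): Promise<Blob> {
  // Ask the origin server for the current App_Code and the nearest peers.
  const appCode = await (await fetch("/app_code")).text();
  const peers: string[] = await (await fetch("/peers")).json();

  // Re-download D_File only when App_Code has changed since the last visit.
  let dFileText = localStorage.getItem("d_file");
  if (localStorage.getItem("app_code") !== appCode || dFileText === null) {
    dFileText = await (await fetch("/d_file.json")).text();
    localStorage.setItem("app_code", appCode);
    localStorage.setItem("d_file", dFileText);
  }
  const dFile: DFileEntry[] = JSON.parse(dFileText);

  // Try the nearest peers first; fall back to the origin server on failure.
  const expected = dFile.find((e) => e.file === path)?.hash;
  if (expected !== undefined) {
    for (const peer of peers) {
      try {
        return await fetchFromPeerAndVerify(peer + "/" + path, expected);
      } catch {
        // Peer unreachable or hash mismatch: try the next peer.
      }
    }
  }
  return (await fetch(path)).blob();
}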