
A System and Method for securely distributing dump files

IP.com Disclosure Number: IPCOM000247096D
Publication Date: 2016-Aug-05
Document File: 9 page(s) / 242K

Publishing Venue

The IP.com Prior Art Database


A system and method is provided for securely distributing dump files as segments. The system consists of an agent component and a management server. The agent component discovers new dump files generated on the target system and produces metadata for their segments. The agent then sends messages to the management server, which generates segment access links and authentication tokens and distributes them to the target dump analyzers. An analyzer can retrieve dump segments on demand during problem diagnosis.

This text was extracted from a PDF file.
This is the abbreviated version, containing approximately 29% of the total text.


A System and Method for securely distributing dump files

When an application crashes, a core dump file is generated containing information such as register contents and the program's call stacks for problem investigation. When the operating system itself fails to respond, a core dump is likewise generated with as much information as possible for later investigation. Core dump files tend to be large. Remote debugging technology can be employed to investigate the core file: a debug proxy runs on the server where the core dump file sits, and the debugger communicates with the proxy to do the actual debugging.

(In the above Figure, TCP is an acronym for Transmission Control Protocol.)
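The proxy arrangement described above can be sketched as follows. This is a minimal illustration, not the disclosure's actual protocol: it assumes a hypothetical request format in which the debugger asks the proxy for a byte range of the core dump over TCP as "offset length".

```python
# Minimal sketch of a remote-debug proxy (hypothetical protocol): the proxy
# sits next to the core dump and serves byte ranges requested by a debugger.
import socket
import threading

def serve_dump(dump_bytes, host="127.0.0.1"):
    # Bind to an OS-assigned port and serve a single byte-range request.
    srv = socket.socket()
    srv.bind((host, 0))
    srv.listen(1)

    def handle():
        conn, _ = srv.accept()
        with conn:
            req = conn.makefile().readline()          # e.g. "4 4\n"
            offset, length = map(int, req.split())
            conn.sendall(dump_bytes[offset:offset + length])
        srv.close()

    threading.Thread(target=handle, daemon=True).start()
    return srv.getsockname()[1]                       # port chosen by the OS

# "Debugger" side: ask the proxy for 4 bytes starting at offset 4.
port = serve_dump(b"COREDUMPDATA")
with socket.create_connection(("127.0.0.1", port)) as cli:
    cli.sendall(b"4 4\n")
    data = cli.recv(4)
print(data)   # b'DUMP'
```

The point of the arrangement is that only the requested ranges cross the network, rather than the whole dump file.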

Another option is to copy the core dump file to a debugger system and do the debugging locally, but this consumes considerable network bandwidth when the core dump file is large. Other related prior art creates the core dump file on the debugger system directly, either in a clustered environment or with the two systems connected by a serial bus. A further improvement on these approaches is to create a minidump file to reduce the size of the core dump.

To protect the data in a core dump file, the basic mechanism is to obfuscate the in-memory data with encryption keys. This disclosure presents a new way to protect the sensitive data in the core dump.
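As an illustration of the baseline mechanism the text mentions, the sketch below obfuscates dump bytes with a key. A simple repeating-key XOR stands in for a real cipher here (a production system would use an actual encryption algorithm such as AES); the function name is an assumption for illustration only.

```python
# Illustrative stand-in for obfuscating dump data with an encryption key.
# XOR with a repeating key; the same operation reverses the obfuscation.
def obfuscate(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

dump = b"sensitive register contents"
key = b"\x5a\xa3\x17"

protected = obfuscate(dump, key)
restored = obfuscate(protected, key)   # XOR is its own inverse
print(restored == dump)                # True
```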

To address the above problems, this invention discloses a method that divides dump files into segments and adds a management flow. With this invention, a tester or analyzer can debug dump files that would otherwise be out of reach, without a tedious dump-file archiving and transfer process.



The major invention points are as follows, for the case where a tester needs to debug dump files that are out of direct reach on the system keeping them:


1. The system adds an agent component to analyze the data structure of dump files. Each dump file is decomposed into a key field and a pointer list containing data segments. The key field summarizes the identity of the dump file and is sent automatically to a separate management application.
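Invention point 1 can be sketched as below. The segment size, the use of a SHA-256 digest as the key field, and all names are assumptions for illustration; the disclosure does not specify how the key field or pointer list are computed.

```python
# Sketch of the agent decomposing a dump into a key field plus a pointer
# list of segment descriptors (hypothetical representation).
import hashlib

SEGMENT_SIZE = 8  # tiny for illustration; real segments would be far larger

def decompose(dump: bytes):
    # Key field: summarizes the identity of the dump file.
    key_field = hashlib.sha256(dump).hexdigest()
    # Pointer list: one descriptor per fixed-size segment.
    pointer_list = [
        {"index": i // SEGMENT_SIZE,
         "offset": i,
         "length": min(SEGMENT_SIZE, len(dump) - i)}
        for i in range(0, len(dump), SEGMENT_SIZE)
    ]
    return key_field, pointer_list

key_field, pointers = decompose(b"0123456789ABCDEF0123")
print(len(pointers))            # 3 segments: 8 + 8 + 4 bytes
print(pointers[-1]["length"])   # 4
```

Only the key field and pointer list need to travel to the management application; the segment bytes stay on the original system until requested.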

2. The management server offers only management-flow services. It generates and broadcasts a link through which the key field can be accessed in a field list. While accessing the key field, when a DEV/tester (DEV is shorthand for Developer throughout this document) identifies the dump file and requests one or more of its data segments, a token is forwarded by the agent and must be validated by the management server.
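One way to realize the token exchange in invention point 2 is sketched below. The disclosure does not specify a token format, so an HMAC over the dump's key field and segment index is assumed here purely for illustration; the secret and all names are hypothetical.

```python
# Sketch of the management server issuing and validating per-segment
# access tokens (assumed HMAC-based scheme, not the disclosure's own).
import hashlib
import hmac

SERVER_SECRET = b"management-server-secret"  # hypothetical

def issue_token(key_field: str, segment_index: int) -> str:
    msg = f"{key_field}:{segment_index}".encode()
    return hmac.new(SERVER_SECRET, msg, hashlib.sha256).hexdigest()

def validate_token(key_field: str, segment_index: int, token: str) -> bool:
    return hmac.compare_digest(issue_token(key_field, segment_index), token)

tok = issue_token("a1b2c3", 0)
ok = validate_token("a1b2c3", 0, tok)
wrong = validate_token("a1b2c3", 1, tok)
print(ok)      # True
print(wrong)   # False: the token is bound to one specific segment
```

Binding the token to both the dump identity and the segment index means a leaked token grants access to at most one segment of one dump.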

3. The dump file is made available to the DEV/tester by the special agent. The DEV/tester can utilize the data segments and capture as many parts of the dump file as needed within a period of time.
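The time-bounded, on-demand retrieval in invention point 3 can be sketched as follows. The class, the fixed validity window, and the simple deadline check are all illustrative assumptions about how the agent might enforce the access period.

```python
# Sketch of the agent serving dump segments on demand while the tester's
# access window is open (hypothetical enforcement of the time period).
import time

class SegmentAgent:
    def __init__(self, dump: bytes, segment_size: int, valid_seconds: float):
        self.dump = dump
        self.segment_size = segment_size
        self.deadline = time.monotonic() + valid_seconds

    def fetch(self, index: int) -> bytes:
        # Refuse requests once the access window has expired.
        if time.monotonic() > self.deadline:
            raise PermissionError("access window expired")
        start = index * self.segment_size
        return self.dump[start:start + self.segment_size]

agent = SegmentAgent(b"REGISTERS|STACK|HEAP", segment_size=5, valid_seconds=60)
seg0 = agent.fetch(0)
seg2 = agent.fetch(2)
print(seg0)   # b'REGIS'
print(seg2)   # b'STACK'
```

The tester pulls only the segments relevant to the diagnosis, and the agent stops serving once the period elapses.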

Novel Points:

a. Dump file segmentation: the dump file is managed by segment to reduce data transfer and to enhance access control. The agent, which runs on the original system, analyzes the data structure of the dump files and generates a reference list which contains some data segm...