
Database table decomposition and re-aggregation with data fogging for security purposes

IP.com Disclosure Number: IPCOM000200475D
Publication Date: 2010-Oct-15
Document File: 3 page(s) / 79K

Publishing Venue

The IP.com Prior Art Database

Abstract

This article describes a possible method that adds security to stored database tables by decomposing the original table so that sensitive data is split from commonly known data such as personal identification. In addition, the same method is the basis for injecting fake data (fogging) and for automatically identifying and filtering it when authorized operations are performed. This article takes advantage of two previously unrelated patents.




Data theft happens every day and, unfortunately, even if sensitive data are frequently encrypted, once the data are stolen, brute-force attacks can overcome those protections if the prize is worth the effort. This situation will only grow as technology and economics push toward cloud computing.

Existing methods that encrypt the external keys linking tables, using a client-server model, make data reconstruction very difficult.

These existing methods can be improved in two ways:
1) for real performance, it is better to use a server-side mechanism as much as possible, so that the function can be internalized in the DB routines;
2) in addition to data decoupling, it would be desirable to have a single mechanism that can add fake data (fogging) and concurrently recognize what the fog is, in order to raise the cost of any unauthorized data reconstruction.

This article proposes to use, for instance, existing algorithms for key generation based on random-number initialization (the LK method, for short, throughout this document), which combine the benefits of fogging and server-side processing in a single mechanism.
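The abbreviated text does not spell out the LK algorithm itself, so the following Python sketch only illustrates the idea under stated assumptions: a secret initialized from a random number is held server side, and an HMAC over the row identifier (an assumed construction, not the disclosed one) yields link keys that the server can always recompute but that look random to anyone who steals the tables.

import hashlib
import hmac
import os

# Random-number initialization: the secret is created once, server side,
# and never leaves the database engine.
SERVER_SECRET = os.urandom(32)

def link_key(row_id: int) -> str:
    """Derive an opaque but reproducible key for one logical row.

    The server can always recompute the key, so it can re-aggregate the
    split tables; an attacker sees only random-looking values.
    """
    return hmac.new(SERVER_SECRET,
                    str(row_id).encode("utf-8"),
                    hashlib.sha256).hexdigest()

print(link_key(1))  # stable for this server secret, opaque otherwise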

Let us explain how it may work with an example on a single table; the extension to multiple tables is straightforward.

Assume we have a table like the following:

[Original example table not reproduced in the text extraction.]

An analysis (whether human or automated is not relevant for our goal) is performed to decide what is sensitive; here we decide it is "num.conto", "saldo", "fido", and "custcode". We therefore split the original table to segregate the sensitive data from the known personal data, as follows. In the process we shuffle the rows of each split table and add some rows of fake but realistic data (using an existing tool able to transform confidential information by generating fake data).

[Split tables, with shuffled rows and injected fake entries, not reproduced in the text extraction.]

Without knowing the original table, it would be impossible to understand which person has 7656,89 as "saldo".
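As a hedged sketch of this decomposition-plus-fogging step, the following Python example splits a table, shuffles the rows, and injects fake rows whose link keys the server never derives. It uses SQLite in memory, and all table names, column names, and sample values are illustrative assumptions, since the original figures are not included in this extract.

import hashlib
import hmac
import os
import random
import sqlite3

SERVER_SECRET = os.urandom(32)  # random-number initialization, server side

def link_key(row_id: int) -> str:
    # Same assumed construction as the previous sketch.
    return hmac.new(SERVER_SECRET, str(row_id).encode("utf-8"),
                    hashlib.sha256).hexdigest()

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Original table mixing personal data with the sensitive columns chosen
# by the analysis ("num_conto", "saldo", "fido", "custcode").
cur.execute("""CREATE TABLE customers (
    name TEXT, address TEXT,
    num_conto TEXT, saldo REAL, fido REAL, custcode TEXT)""")
cur.executemany("INSERT INTO customers VALUES (?, ?, ?, ?, ?, ?)", [
    ("Alice Rossi", "Via Roma 1", "0001", 7656.89, 1000.0, "C001"),
    ("Bruno Bianchi", "Via Po 2", "0002", 120.50, 500.0, "C002"),
])

# Split tables: personal data and sensitive data, related only by the
# opaque link key instead of a plain foreign key.
cur.execute("CREATE TABLE personal (lk TEXT, name TEXT, address TEXT)")
cur.execute("""CREATE TABLE sensitive (
    lk TEXT, num_conto TEXT, saldo REAL, fido REAL, custcode TEXT)""")

personal, sensitive = [], []
for rowid, name, addr, conto, saldo, fido, cust in cur.execute(
        "SELECT rowid, * FROM customers").fetchall():
    lk = link_key(rowid)
    personal.append((lk, name, addr))
    sensitive.append((lk, conto, saldo, fido, cust))

# Fogging: fake but realistic rows carry random keys that the server will
# never derive, so only the server can later recognize and filter them.
personal.append((os.urandom(32).hex(), "Carla Verdi", "Via Garibaldi 3"))
sensitive.append((os.urandom(32).hex(), "0099", 999.99, 0.0, "C099"))

# Shuffle each split table independently before storing it.
random.shuffle(personal)
random.shuffle(sensitive)
cur.executemany("INSERT INTO personal VALUES (?, ?, ?)", personal)
cur.executemany("INSERT INTO sensitive VALUES (?, ?, ?, ?, ?)", sensitive)

Because the fog rows carry keys outside the set the server can derive, they are indistinguishable from genuine rows to an attacker but trivially recognizable to the server.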

The next step is to add a method for the database to reconstruct the original record as needed. Existing methods describe a generic encryption algorithm that encrypts the external key of one table to the other. In...
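The extract breaks off here. As a complement, and under the same assumptions as the sketches above (not the generic external-key encryption algorithm of the cited methods, which this extract does not detail), authorized re-aggregation could look like the following, continuing the previous sketch and reusing its cur connection and link_key helper: the server recomputes the set of legitimate link keys, joins the split tables on them, and silently drops the fog rows.

def reaggregate(cur, num_rows: int) -> list:
    """Rebuild the original records; fog rows drop out automatically."""
    # Only the server can recompute the legitimate link keys.
    valid = {link_key(i) for i in range(1, num_rows + 1)}
    records = []
    for lk, name, addr in cur.execute("SELECT * FROM personal").fetchall():
        if lk not in valid:  # fog row: its key was never derived
            continue
        srow = cur.execute(
            "SELECT num_conto, saldo, fido, custcode "
            "FROM sensitive WHERE lk = ?", (lk,)).fetchone()
        records.append((name, addr) + srow)
    return records

print(reaggregate(cur, num_rows=2))  # only the two genuine customers appear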