
Processing persistent publish subscribe handling high volume of persistent publications.

IP.com Disclosure Number: IPCOM000029266D
Original Publication Date: 2004-Jun-21
Included in the Prior Art Database: 2004-Jun-21
Document File: 3 page(s) / 52K

Publishing Venue

IBM

Abstract

The present publication discloses a broker network for publishing and subscribing to a high volume of data.


The present publication is directed to data processing and, more specifically, to a system and method for distributing data messages from suppliers (hereinafter called "publishers") to consumers (hereinafter called "subscribers").

    Data processing systems for publishing/subscribing data have become very popular in recent years for distributing data messages from publishers to subscribers. In such systems, the publisher's application does not need to know the identity or location of the subscriber's application receiving the messages. The publisher only needs to be connected to a publish/subscribe distribution agent (the terms "distribution agent" and "broker" are used interchangeably herein) within a group of agents making up a broker network. The publisher sends messages to the distribution agent by specifying the subject (or "topic") of the message. The topic of a message specifies the interest that producers and consumers have in this message. The topic is usually formatted as a string of characters, where subtopics are separated by a known separation character (such as the forward slash "/").
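
    As an illustration of the topic format described above, a topic string can be decomposed into its subtopics at the separator character. The following is a minimal sketch in Python; the topic value is a hypothetical example, not one taken from the disclosure.

    # Hypothetical topic string; "/" is the separator character mentioned above.
    topic = "finance/stocks/IBM/price"

    # Decompose the topic into its subtopics.
    subtopics = topic.split("/")
    print(subtopics)  # ['finance', 'stocks', 'IBM', 'price']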

    A message can be persistent or non-persistent. Non-persistent messages are lost if no subscriber is interested in them. Persistent messages are archived by the broker in permanent storage such as a relational database. In the prior art, the topic is used as a global key. This leads to severe performance problems, especially when subscribers and publishers are interested in selecting multiple different subtopics within complex topics. For instance, it is not possible with the current state of the art to handle 100,000 subscriptions with different topics for cases with mixed requirements pertaining to both Enterprise Application Integration (EAI) and Extract, Transform and Load (ETL).
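
    To make the performance problem concrete, the following is a minimal Python sketch of the prior-art layout, assuming a single relational table in which the full topic string is the global key; all table, column and topic names are hypothetical and not taken from the disclosure.

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE messages (topic TEXT, payload BLOB)")
    db.execute("CREATE INDEX idx_topic ON messages (topic)")
    db.execute("INSERT INTO messages VALUES ('finance/stocks/IBM/price', x'00')")

    # A subscription selecting on an inner subtopic (here: any market, stock
    # 'IBM') cannot use the index on the full topic string; the broker falls
    # back to scanning every stored topic, which is what breaks down when the
    # number of topics and subscriptions becomes very large.
    rows = db.execute(
        "SELECT topic FROM messages WHERE topic LIKE '%/IBM/%'"
    ).fetchall()
    print(rows)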

    The present publication provides a broker network capable of handling a high volume of persistent messages. The key element of the present publication is to store each subtopic of a given topic in a different column of a relational database table, where indexes can be used and access optimized. As the number of subtopics varies from topic to topic, the number of columns for the subtopics cannot be predefined. In the majority of cases, the number of subtopics is less than 20, but the system according to the present publication is able to handle more subtopics. The second key element is the distribution of the subtopics across several tables, each table having a fixed number of columns, the overall number of tables providing the necessary storage area. To publish a new topic, if there is no...
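
    The following Python sketch illustrates the layout described above under a few assumptions: each table carries a fixed number of subtopic columns (four here, for brevity), further fixed-width tables take the overflow for deeper topics, and the parts of one topic are linked by a message identifier; all table, column and topic names are hypothetical.

    import sqlite3

    COLUMNS_PER_TABLE = 4  # fixed number of subtopic columns per table (assumption)

    db = sqlite3.connect(":memory:")
    # Two fixed-width tables give room for up to eight subtopics; further tables
    # could be added in the same way for deeper topics.
    db.execute("CREATE TABLE topic_part_1 (msg_id INTEGER, "
               "sub1 TEXT, sub2 TEXT, sub3 TEXT, sub4 TEXT)")
    db.execute("CREATE TABLE topic_part_2 (msg_id INTEGER, "
               "sub5 TEXT, sub6 TEXT, sub7 TEXT, sub8 TEXT)")
    # Each subtopic column carries its own index, so selections on individual
    # subtopics become index lookups instead of scans over full topic strings.
    for table, cols in (("topic_part_1", ("sub1", "sub2", "sub3", "sub4")),
                        ("topic_part_2", ("sub5", "sub6", "sub7", "sub8"))):
        for col in cols:
            db.execute(f"CREATE INDEX idx_{table}_{col} ON {table} ({col})")

    def publish(msg_id, topic):
        """Spread the subtopics of one published topic over the fixed-width tables."""
        subs = topic.split("/")
        for part, start in enumerate(range(0, len(subs), COLUMNS_PER_TABLE), 1):
            chunk = subs[start:start + COLUMNS_PER_TABLE]
            chunk += [None] * (COLUMNS_PER_TABLE - len(chunk))  # pad short rows
            db.execute(f"INSERT INTO topic_part_{part} VALUES (?, ?, ?, ?, ?)",
                       [msg_id] + chunk)

    publish(1, "finance/stocks/IBM/price/close/usd")

    # A subscription on the third subtopic is answered through a column index.
    print(db.execute("SELECT msg_id FROM topic_part_1 WHERE sub3 = 'IBM'").fetchall())

    Under this layout, the number of tables, rather than the width of any single table, grows with the maximum topic depth, and subscriptions that select on individual subtopics can be served by index access.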