Time-based aggregated sample clock face visualiser with zoom

IP.com Disclosure Number: IPCOM000022507D
Original Publication Date: 2004-Mar-18
Included in the Prior Art Database: 2004-Mar-18
Document File: 3 page(s) / 64K

Publishing Venue

IBM

Abstract

The clock face visualiser provides a straightforward and easy-to-use mechanism for quickly ascertaining the behaviour of a monitored environment, from a macroscopic level, i.e. aggregated measurements, down to a microscopic level, i.e. individual measurements. The visualiser can be used in applications ranging from monitoring middleware software to monitoring telemetric devices. It allows the user to move between levels using the temporal metaphor of a clock face. Periods of time where the behaviour of the environment is constant are drawn with the same legend key for the whole unchanged duration; periods of time where the behaviour is changing are drawn with a different legend key. When the user zooms into a resolution of time at which the behaviour becomes constant, the first legend key is used again.

Problem Statement

Monitoring and detecting changes in the runtime attributes of middleware software can be hampered by the large number of objects defined in that system. For these objects, there are often a small number of attributes per object that change frequently compared with a large number of attributes that change rarely. Since virtually all object attributes can change (except for attributes such as time of creation, description, etcetera), it can be necessary to monitor all attributes of all objects to detect any changes. Some object attributes may change from one state to another and back again within a short period of time, which necessitates recording object attributes at a very short sampling interval (assuming that a native event is not provided). This can result in the need to record large volumes of data, with important data buried underneath data of no importance: it can be difficult to see the wood for the trees.

Describe Solution

    The aim of this disclosure is to demonstrate a methodology that manages large volumes of data collected over time and presents it in such a way that a parenthesis of time in which a change occurred can be identified at a 'low level of resolution'; zooming in on that parenthesis then resolves to the precise time at which the change occurred at the 'highest level of resolution'. This is all achieved using the temporal metaphor of a clock face, which draws on a natural association between where an event is depicted on the clock face and how recently it occurred.

    The methodology presents the data at progressively higher resolution as the parenthesis of time being examined narrows. The lowest resolution provides averages of the underlying data; the highest resolution reveals the individual sample data. A resolution level can either be derestricted, i.e. zoomed out to a lower resolution to summarise more data in a wider timeframe, or restricted to its underlying data, i.e. zoomed in to examine the data that combines to provide the average samples depicted at the current resolution.
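    The sketch below illustrates one way this restrict/derestrict navigation could be modelled; it is not taken from the disclosure, and the names (ResolutionLevel, zoom_in, build_clock_faces) are purely illustrative. Each clock face holds sixty values, and zooming in on one value exposes the sixty finer-grained values that were averaged to produce it.

    # Hypothetical sketch of zooming between resolution levels; illustrative names only.
    from statistics import mean


    class ResolutionLevel:
        """One clock face: sixty values, each possibly backed by a finer-grained level."""

        def __init__(self, name, values, children=None):
            self.name = name          # e.g. "per-minute" or "per-second"
            self.values = values      # the sixty values shown on this face
            self.children = children  # one finer-grained level per value, or None

        def zoom_in(self, position):
            """Restrict to the data underlying the value at `position` (0-59)."""
            if self.children is None:
                raise ValueError("already at the highest resolution")
            return self.children[position]


    def build_clock_faces(per_second_samples):
        """Aggregate one hour (3600) of per-second samples into a per-minute face."""
        minutes = []
        for m in range(60):
            seconds = per_second_samples[m * 60:(m + 1) * 60]
            minutes.append(ResolutionLevel("per-second", seconds))
        return ResolutionLevel(
            "per-minute", [mean(lvl.values) for lvl in minutes], children=minutes)


    # Example: one hour of stand-in readings, zooming in on minute 5.
    samples = [float(s % 7) for s in range(3600)]
    minute_face = build_clock_faces(samples)
    raw_for_minute_5 = minute_face.zoom_in(5).values   # the sixty raw per-second samples

    A per-hour face would be built the same way from sixty such per-minute faces; zooming back out is simply a return to the parent face, so no raw data is ever discarded.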

    The highest resolution is a clock face showing sixty samples taken at one-second intervals. These samples are averaged to provide per-minute averages, depicted at the level of resolution layered on top of the per-second clock face. These averages are in turn combined to provide sixty per-hour averages. Each per-hour average therefore summarises three thousand six hundred individual per-second samples, i.e. sixty per-minute averages each containing sixty per-second samples.
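    As a quick check of this arithmetic, the following sketch (using random stand-in data, not measurements from the disclosure) averages sixty per-second samples into each per-minute value and sixty per-minute values into one per-hour value, so each per-hour value summarises 60 x 60 = 3600 raw samples; with equal-sized buckets the average of the averages equals the average of the raw samples, so nothing is misrepresented at the coarser level.

    # Illustrative check of the aggregation arithmetic; random data stands in for real samples.
    import random
    from statistics import mean

    random.seed(0)
    per_second = [random.random() for _ in range(60 * 60)]   # one hour of samples

    # Sixty per-minute averages, each summarising sixty per-second samples.
    per_minute = [mean(per_second[m * 60:(m + 1) * 60]) for m in range(60)]

    # One per-hour average, summarising sixty per-minute averages.
    per_hour = mean(per_minute)

    assert len(per_second) == 3600                      # 60 minutes x 60 seconds
    assert abs(per_hour - mean(per_second)) < 1e-9      # mean of equal-sized means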

    The methodology simultaneously accommodates two capabilities from opposite ends of the data analysis spectrum: the first is not to discard any of the physical measurements of the system, which would compromise the completeness of the data; the second capability is to provide markers that promote accurate and rapid identification (a funnellin...