Automated Test Framework for ‘UML Model Generation Tools’ characterized by unique UML model generated for constant input directives.

IP.com Disclosure Number: IPCOM000212332D
Publication Date: 2011-Nov-07
Document File: 5 page(s) / 55K

Publishing Venue

The IP.com Prior Art Database

Abstract

Disclosed is a method to validate the behavior/code of Unified Modeling Language (UML) development tools that produce unique output at different instances of time for the same tool code and a constant input. The approach ascertains code sanity, and in turn consistency in the tool's characteristics across successive iterations of the tool's development, by leveraging an automated test framework that compares the unique UML model output at any instance of time for an input directive against a previously validated output produced for the same input directive.

This text was extracted from a PDF file.
This is the abbreviated version, containing approximately 23% of the total text.

The invention resolves the issue of validating the behavior/code of Unified Modeling Language (UML) development tools that produce unique output at different instances of time for the same tool code and a constant input. Since the output models are always unique, the challenge is to validate the tool's behavior/code based on the output UML models it generates. Although the models are unique, the common approach is to leverage tools that match their salient constituent elements to ascertain whether these unique models are equivalent. Given this notion of equivalence, a modification in behavior/code manifests itself by producing models that are unique but non-equivalent, which is what allows such modifications to be detected and validated.

Based on certain input directives, UML models can be generated and consequently stored in the Extensible Markup Language (XML)-based XML Metadata Interchange/Eclipse Modeling Framework (XMI/EMF) format. The generation of these UML models in EMF format involves assigning each constituent UML artifact a unique token, referred to as a Globally Unique Identifier (GUID), which serves both to uniquely distinguish the element and to support inter-element referencing. This generation of unique ids compels the output produced each time to differ from the output previously produced for the same input directive at a separate instance of time. Hence the models are semantically equivalent, although at a low level an American Standard Code for Information Interchange (ASCII) diff of the model files shows differences.
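To illustrate, the per-run GUIDs can be canonicalized so that two semantically equivalent serializations compare equal. The sketch below is plain Java using only the JDK's DOM APIs; the attribute names (`xmi:id`, `xmi:idref`) are illustrative of XMI, and `XmiNormalizer` is a hypothetical helper, not part of any UML tool. It remaps GUIDs to ids assigned in document order while keeping inter-element references consistent.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.*;
import java.io.ByteArrayInputStream;
import java.io.StringWriter;
import java.util.HashMap;
import java.util.Map;

public class XmiNormalizer {

    // Rewrite per-run GUIDs to canonical ids assigned in document order so
    // that two semantically equivalent serializations become textually equal.
    public static String normalize(String xmi) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xmi.getBytes("UTF-8")));
        Map<String, String> canonical = new HashMap<>();
        collectIds(doc.getDocumentElement(), canonical);  // pass 1: map GUIDs
        rewrite(doc.getDocumentElement(), canonical);     // pass 2: substitute
        StringWriter out = new StringWriter();
        TransformerFactory.newInstance().newTransformer()
                .transform(new DOMSource(doc), new StreamResult(out));
        return out.toString();
    }

    private static void collectIds(Element e, Map<String, String> canonical) {
        Attr id = e.getAttributeNode("xmi:id");
        if (id != null) {
            canonical.putIfAbsent(id.getValue(), "id" + canonical.size());
        }
        NodeList kids = e.getChildNodes();
        for (int i = 0; i < kids.getLength(); i++) {
            if (kids.item(i) instanceof Element) {
                collectIds((Element) kids.item(i), canonical);
            }
        }
    }

    private static void rewrite(Element e, Map<String, String> canonical) {
        NamedNodeMap attrs = e.getAttributes();
        for (int i = 0; i < attrs.getLength(); i++) {
            Attr a = (Attr) attrs.item(i);
            // Covers xmi:id itself plus any attribute (e.g. xmi:idref) that
            // references a collected GUID; assumes GUIDs do not collide with
            // ordinary attribute values.
            String mapped = canonical.get(a.getValue());
            if (mapped != null) {
                a.setValue(mapped);
            }
        }
        NodeList kids = e.getChildNodes();
        for (int i = 0; i < kids.getLength(); i++) {
            if (kids.item(i) instanceof Element) {
                rewrite((Element) kids.item(i), canonical);
            }
        }
    }

    public static void main(String[] args) throws Exception {
        // The same model emitted twice; only the GUIDs differ between runs.
        String run1 = "<model><packagedElement xmi:id=\"g-aaa\" name=\"Customer\"/>"
                    + "<ownedAttribute xmi:idref=\"g-aaa\"/></model>";
        String run2 = "<model><packagedElement xmi:id=\"g-zzz\" name=\"Customer\"/>"
                    + "<ownedAttribute xmi:idref=\"g-zzz\"/></model>";
        System.out.println(normalize(run1).equals(normalize(run2))); // prints: true
    }
}
```

After normalization, an ordinary ASCII diff of the two files would be empty, which is the equivalence the disclosure relies on.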

The issue at hand is to automatically validate code that evolves or extends a UML model generation tool by comparing the unique outputs produced for a given constant set of input directives at different instances of time. A solution to this problem would increase code productivity, since an automated test framework could ascertain that no unforeseen deltas have been introduced into the output and, in turn, that the code's functionality and characteristics have not accidentally changed during further development of the tool.
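A minimal sketch of such a regression check follows. The hypothetical `noUnforeseenDeltas` compares the tool's current output to a previously validated baseline after crudely blanking GUID-bearing attributes with a regular expression; the attribute names are illustrative, and a production framework would remap ids consistently rather than blank them, so that reference structure is also compared.

```java
import java.util.regex.Pattern;

public class RegressionCheck {
    // Crude stand-in for full XMI canonicalization: blank the values of
    // GUID-bearing attributes before comparing (attribute names assumed).
    private static final Pattern GUID_ATTR =
            Pattern.compile("(xmi:(?:id|idref|uuid))=\"[^\"]*\"");

    static String canonicalize(String xmi) {
        return GUID_ATTR.matcher(xmi).replaceAll("$1=\"*\"");
    }

    // Passes when the current output matches the previously validated
    // baseline, modulo the GUIDs that are unique to each run.
    static boolean noUnforeseenDeltas(String current, String baseline) {
        return canonicalize(current).equals(canonicalize(baseline));
    }

    public static void main(String[] args) {
        String baseline = "<packagedElement xmi:id=\"guid-1\" name=\"Order\"/>";
        String current  = "<packagedElement xmi:id=\"guid-9\" name=\"Order\"/>";
        System.out.println(noUnforeseenDeltas(current, baseline)); // true
        String drifted  = "<packagedElement xmi:id=\"guid-9\" name=\"Orders\"/>";
        System.out.println(noUnforeseenDeltas(drifted, baseline)); // false
    }
}
```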

JUnit is a powerful framework that can be leveraged to automate testing. JUnit embodies a testing philosophy in its own right, but it still needs direction in terms of what needs to be tested. The following are two contemporary approaches to demonstrating that the code, and in turn the tool's characteristics, have not changed.

The first approach is a conventional one, prevalent in many development environments including this one, where JUnit tests specifically check for the presence of a particular UML element and its features. Open source Application Programming Interfaces (APIs) exist for this purpose, including those whereby the presence of UML elements, as w...
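The element-presence style of check described above might be sketched as follows. In a real suite this would be a JUnit assertion; it is shown here as plain Java using the JDK's XPath API to stay self-contained, and the element and attribute names are illustrative of XMI rather than taken from any particular tool.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import java.io.ByteArrayInputStream;

public class ElementPresenceCheck {
    // Returns true if the model contains a packagedElement whose name
    // attribute equals the given value (illustrative XMI vocabulary).
    static boolean hasElementNamed(String xmi, String name) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xmi.getBytes("UTF-8")));
        String expr = "//packagedElement[@name='" + name + "']";
        return (Boolean) XPathFactory.newInstance().newXPath()
                .evaluate(expr, doc, XPathConstants.BOOLEAN);
    }

    public static void main(String[] args) throws Exception {
        String model = "<model>"
                     + "<packagedElement xmi:id=\"g1\" name=\"Customer\"/>"
                     + "</model>";
        System.out.println(hasElementNamed(model, "Customer")); // true
        System.out.println(hasElementNamed(model, "Invoice"));  // false
    }
}
```

Note that such a check is insensitive to the per-run GUIDs, which is why it remains stable even though each generated model file differs textually from the last.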