
TWO APPROACHES TO INTERPROCEDURAL DATA FLOW ANALYSIS READING ROOM

IP.com Disclosure Number: IPCOM000128229D
Original Publication Date: 1978-Dec-31
Included in the Prior Art Database: 2005-Sep-15

Publishing Venue

Software Patent Institute

Related People

MICHA SHARIR: AUTHOR

Abstract

Under the general heading of Program Analysis we can find today two disciplines which, even though having similar aims, differ in the means and tools they apply to the task of analysis. The first is the discipline associated with Program Verification. This is usually presented as the process of finding invariants of the program, or in other words fully characterizing the behavior of the program, discovering all the properties of all possible executions ([MA], [CO]). As such it is extremely ambitious and hence a priori doomed to failure on theoretical grounds for all but the most restricted program models. The second discipline falling under the name of Program Analysis is the more pragmatically oriented Data Flow Analysis. Associated with Optimizing Compilers, this methodology is very much concerned with questions of effectiveness and efficiency, in particular the tradeoff between the effort invested and the increment gained in the quality of the produced code. Quite understandably, its objectives are more modest. The reduced ambitiousness is expressed in not trying to extract all properties of the program but concentrating on several simple, well-defined properties such as the availability of expressions, the types and attributes of dynamic values, the constancy of variables, etc. A basic technique used to analyze procedureless programs (or single procedures) is to transform them into a flow graph [AL1] and assume that all paths in this graph can represent actual executions of the program.

This text was extracted from a PDF file.
This is the abbreviated version, containing approximately 6% of the total text.


TWO APPROACHES TO INTERPROCEDURAL DATA FLOW ANALYSIS

MICHA SHARIR and AMIR PNUELI

September 1978

Report No. 002

(1) Courant Institute of Mathematical Sciences, New York University. Work by this author has been supported by NSF Grant MCS-76-00116 and USDOE Office of Energy Research Contract EY-76-C-02-3077.

University of Pennsylvania and Tel-Aviv University.

Introduction

Under the general heading of Program Analysis we can find today two disciplines which, even though having similar aims, differ in the means and tools they apply to the task of analysis. The first is the discipline associated with Program Verification. This is usually presented as the process of finding invariants of the program, or in other words fully characterizing the behavior of the program, discovering all the properties of all possible executions ([MA], [CO]). As such it is extremely ambitious and hence a priori doomed to failure on theoretical grounds for all but the most restricted program models.

The second discipline falling under the name of Program Analysis is the more pragmatically oriented Data Flow Analysis. Associated with Optimizing Compilers, this methodology is very much concerned with questions of effectiveness and efficiency, in particular the tradeoff between the effort invested and the increment gained in the quality of the produced code. Quite understandably, its objectives are more modest. The reduced ambitiousness is expressed in not trying to extract all properties of the program but concentrating on several simple, well-defined properties such as the availability of expressions, the types and attributes of dynamic values, the constancy of variables, etc.

A basic technique used to analyze procedureless programs (or single procedures) is to transform them into a flow graph [AL1] and assume that all paths in this graph can represent actual executions of the program.
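To make the flow-graph model concrete, here is a small illustrative sketch in Python (not taken from the report; the graph, the node names and the GEN/KILL sets are invented for the example). It computes reaching definitions by iterating transfer functions over the graph to a fixpoint, merging information over all graph paths in exactly the "every path may be executed" spirit described above.

    # A minimal sketch of the intraprocedural, "all paths" style of analysis:
    # reaching definitions over a hand-built flow graph.  The graph and the
    # GEN/KILL sets below are illustrative assumptions, not from the paper.

    # Flow graph: node -> list of successor nodes.
    succ = {
        "entry": ["n1"],
        "n1":    ["n2", "n3"],   # branch; both outcomes are assumed executable
        "n2":    ["n4"],
        "n3":    ["n4"],
        "n4":    ["exit"],
        "exit":  [],
    }

    # Definition labels generated / killed at each node.
    gen  = {"entry": set(), "n1": {"d1"}, "n2": {"d2"}, "n3": {"d3"}, "n4": set(), "exit": set()}
    kill = {"entry": set(), "n1": set(),  "n2": {"d3"}, "n3": {"d2"}, "n4": set(), "exit": set()}

    def reaching_definitions(succ, gen, kill):
        """Iterate IN/OUT sets to a fixpoint, merging over all graph paths."""
        pred = {n: [] for n in succ}
        for n, ss in succ.items():
            for s in ss:
                pred[s].append(n)

        in_ = {n: set() for n in succ}
        out = {n: set() for n in succ}
        changed = True
        while changed:
            changed = False
            for n in succ:
                new_in = set().union(*(out[p] for p in pred[n])) if pred[n] else set()
                new_out = gen[n] | (new_in - kill[n])
                if new_in != in_[n] or new_out != out[n]:
                    in_[n], out[n] = new_in, new_out
                    changed = True
        return in_, out

    if __name__ == "__main__":
        in_, out = reaching_definitions(succ, gen, kill)
        print(out["n4"])   # {'d1', 'd2', 'd3'}: the merge over both branch paths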

This model does not describe the "true" run-time situation correctly; in fact, most of the graph paths are not feasible, i.e. they do not represent possible executions of the program. However, this model is widely adopted for two main reasons:

(a) Its relatively simple structure enables us to develop a comprehensive analytic theory, to construct simple algorithms which perform the required program analysis, and to investigate general properties of these algorithms in detail (cf. [HE], [AU] for recent surveys of the subject).

(b) Isolation of feasible paths from non-feasible ones is known to be an undecidable problem, closely related to the Turing machine halting problem.

This classical technique faces significant problems in the presence of procedures. These problems reflect the dependence of individual interprocedural branches upon each other during program execution, a dependence which is known at compile time and is essentially independent of any computation performed during that execution. ...
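The interprocedural difficulty described in this (truncated) passage can be illustrated with another small invented sketch, in the same spirit as the one above. In a combined graph for a program with procedures, call edges and return edges belonging to different call sites can be strung together, yielding paths that no execution can follow; which return matches which call is, as the text notes, known at compile time. The graph, node names and edge labels below are purely hypothetical.

    # A minimal sketch (not from the paper) of why the naive flow-graph model
    # breaks down across procedures: in a combined graph, call and return edges
    # belonging to different call sites can be mixed, producing paths that no
    # execution can follow.

    # Combined graph: main calls procedure p from two sites, A and B.
    edges = [
        ("callA", "p_entry", "callA("),   # call edge from site A
        ("callB", "p_entry", "callB("),   # call edge from site B
        ("p_entry", "p_exit", None),      # body of p
        ("p_exit", "retA", "callA)"),     # return edge back to site A
        ("p_exit", "retB", "callB)"),     # return edge back to site B
    ]

    def paths(src, dst, edges, seen=()):
        """Enumerate all simple paths in the graph, as the naive model would."""
        if src == dst:
            yield [src]
            return
        for a, b, label in edges:
            if a == src and b not in seen:
                for rest in paths(b, dst, edges, seen + (b,)):
                    yield [src] + rest

    def calls_match(path, edges):
        """A path is feasible only if every return goes back to the matching call site."""
        labels = {(a, b): l for a, b, l in edges}
        stack = []
        for a, b in zip(path, path[1:]):
            l = labels[(a, b)]
            if l and l.endswith("("):
                stack.append(l[:-1])
            elif l and l.endswith(")"):
                if not stack or stack.pop() != l[:-1]:
                    return False
        return True

    for target in ("retA", "retB"):
        for p in paths("callA", target, edges):
            print(target, p, "feasible" if calls_match(p, edges) else "INFEASIBLE")

    # The path callA -> p_entry -> p_exit -> retB exists in the graph but is
    # infeasible: the return does not match the call.  This call/return pairing,
    # fixed at compile time, is exactly the interprocedural dependence the text
    # refers to, and it is what the classical "all paths" model fails to exploit.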