
COMPUTATIONAL COMPLEXITY OF RANDOM ACCESS STORED PROGRAM MACHINES

IP.com Disclosure Number: IPCOM000148206D
Original Publication Date: 1970-Aug-31
Included in the Prior Art Database: 2007-Mar-29
Document File: 36 page(s) / 1M

Publishing Venue

Software Patent Institute

Related People

Hartmanis, J.: AUTHOR

Abstract

COMPUTATIONAL COMPLEXITY OF RANDOM ACCESS STORED PROGRAM MACHINES. J. Hartmanis. Technical Report No. 70-70, August 1970. Department of Computer Science, Cornell University, Ithaca, New York 14850. ABSTRACT. In this paper we explore the computational complexity measure defined by running times of programs on random access stored program machines, RASP's. The purpose of this work is to study more realistic complexity measures and to provide a setting and some techniques to explore different computer organizations. The more interesting results of this paper are obtained by an argument about the size of the computed functions. For example, we show (without using diagonalization) that there exist arbitrarily complex functions with optimal RASP programs whose running time cannot be improved by any multiplicative constant. We show, furthermore, that these optimal programs cannot be fixed procedures and determine the difference in computation speed between fixed procedures and self-modifying programs. The same technique is used to compare computation speed of machines with and without built-in multiplication. We conclude the paper with a look at machines with associative memory and distributed logic machines.

This text was extracted from a PDF file.
At least one non-text object (such as an image or picture) has been suppressed.
This is the abbreviated version, containing approximately 11% of the total text.

Page 1 of 36

     COMPUTATIONAL COMPLEXITY OF
RANDOM ACCESS STORED PROGRAM MACHINES

J. Hartmanis

Technical Report No. 70-70
August 1970

Department of Computer Science
Cornell University
Ithaca, New York 14850

[This page contains 1 picture or other non-text object]

Page 2 of 36

     COMPUTATIONAL COMPLEXITY OF
RANDOM ACCESS STORED PROGRAM MACHINES

J. Hartmanis

ABSTRACT.

      In this paper we explore the computational complexity
measure defined by running times of programs on random access
stored program machines, RASP's. The purpose of this work is
to study more realistic complexity measures and to provide a
setting and some techniques to explore different computer
organizations. The more interesting results of this paper
are obtained by an argument about the size of the computed
functions. For example, we show (without using diagonalization)
that there exist arbitrarily complex functions with optimal
RASP programs whose running time cannot be improved by any
multiplicative constant. We show, furthermore, that these
optimal programs cannot be fixed procedures and determine the
difference in computation speed between fixed procedures
and self-modifying programs. The same technique is used to
compare computation speed of machines with and without built-in
multiplication. We conclude the paper with a look at machines

with associative memory and distributed logic machines.
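
Stated symbolically, the central no-speedup claim of the paper (a paraphrase of the abstract above) reads:

```latex
\exists\, f_i,\ T_i \ \text{such that}\ f_i \ \text{is RASP-computable in time}\ T_i(n),
\ \text{yet no RASP program computes}\ f_i \ \text{in time}\ (1-\varepsilon)\,T_i(n)
\ \text{for any}\ \varepsilon > 0.
```

This contrasts with time-limited Turing machine computations, for which speedup by any constant factor is achievable.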


[This page contains 1 picture or other non-text object]

Page 3 of 36

INTRODUCTION.

    The purpose of this paper is to study the quantitative
problems about computation speed of random access stored program
machines, called RASP's. These abstract machines were
defined [1,2] to provide theoretical models which look more
like real computers and which reflect some aspects of real
computing more directly than previously studied Turing machines.
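
As an illustration of the model, the operation-count measure can be sketched with a toy interpreter. The instruction set, accumulator design, and encoding below are assumptions for illustration only, not the formal definition of [1,2]:

```python
# Minimal sketch of a random access stored program machine (RASP).
# The program lives in the same addressable memory as the data, and
# the complexity measure is the number of instructions executed.

def run_rasp(memory, pc=0, max_steps=10_000):
    """Execute instructions stored in `memory` starting at address `pc`.
    Each cell holds either an (opcode, operand) pair or an integer.
    Returns (memory, steps), where `steps` is the operation count."""
    steps = 0
    acc = 0  # single accumulator
    while steps < max_steps:
        op, arg = memory[pc]
        steps += 1
        if op == "LOAD":      # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":     # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":   # memory[arg] <- acc
            memory[arg] = acc
        elif op == "JZERO":   # jump to address arg if acc == 0
            if acc == 0:
                pc = arg
                continue
        elif op == "HALT":
            return memory, steps
        pc += 1
    raise RuntimeError("step limit exceeded")

# Example: add the values in cells 10 and 11, store the sum in cell 12.
prog = {
    0: ("LOAD", 10),
    1: ("ADD", 11),
    2: ("STORE", 12),
    3: ("HALT", 0),
    10: 3, 11: 4, 12: 0,
}
mem, steps = run_rasp(prog)
print(mem[12], steps)  # -> 7 4
```

Because the program occupies the same addressable memory as its data, a STORE into a program address rewrites the program itself; this is the self-modification that the paper later contrasts with fixed procedures.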

In the following section we give an informal definition of

random access stored program machines and derive some elemen-
tary results about the computational complexity measure based
on the number of operations used by RASP's. Next, by a new
technique involving the size of the computed functions, we
study the problem of how well we can define and approach the
best possible computation time with RASP programs. It is
shown that there exist arbitrarily complex computations for
which we can write programs whose computation time cannot be
improved by any factor. That is, there exist functions f_i(n)
which can be computed in time T_i(n) but cannot be computed in
time (1 - ε)T_i(n) for any ε > 0. This result contrasts
sharply with the corresponding results for time-limited, tape-
limited, or reversal-limited Turing machine computations
[3,4,5,6], all of which can be improved by a constant factor. It
is also interesting to observe that this result is obtained
without using a diagonal argument, which up until now was the
only technique to prove results of this type. Clearly, some
[This page contains 1 picture or other non-text object]