
Post-Mortem Verification

Disclosure Number: IPCOM000227949D
Publication Date: 2013-May-30
Document File: 6 page(s) / 54K

Publishing Venue

The Prior Art Database


Disclosed is a technique, Post-Mortem Verification, that addresses the hard-to-debug issues which arise as a side effect of reduced runtime verification, by allowing those issues to be identified after the program has run. Post-Mortem Verification runs when an application has terminated with a core dump, or when core dumps are generated on demand to check for potential issues. Using the information in the core dump, Post-Mortem Verification completes either the same checks that would have been completed at runtime, or checks that are more aggressive.

This text was extracted from a PDF file.
This is the abbreviated version, containing approximately 20% of the total text.


Post-Mortem Verification

Java* uses classes as the binary representation for the programs it executes. Classes may come from files (class files), may be downloaded from a remote source, or may be dynamically generated or modified. There are specifications for the structure of classes as well as rules that a properly formed class must comply with (the class file format). The rules go beyond simple structure checks and include validation that, when the code in the class is executed, it will not exceed limits (for example, stack overflows or underflows), will not access resources it should not be able to (for example, that stores to variables are valid), and that method parameters are of the correct types.
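As an illustration, the most basic structural rules can be sketched as a check on the class file header: every class file must begin with the magic number 0xCAFEBABE, followed by minor and major version numbers. The `ClassFileHeader` class below is a hypothetical helper written for this sketch, not part of any JVM API.

```java
import java.nio.ByteBuffer;

public class ClassFileHeader {

    // Returns true if the byte stream begins with a structurally valid
    // class file header: magic number, then minor and major versions.
    public static boolean hasValidHeader(byte[] bytes) {
        if (bytes.length < 8) {
            return false; // too short to contain magic + version numbers
        }
        ByteBuffer buf = ByteBuffer.wrap(bytes);
        int magic = buf.getInt();
        if (magic != 0xCAFEBABE) {
            return false; // wrong magic number
        }
        int minor = buf.getShort() & 0xFFFF; // minor version (unchecked here)
        int major = buf.getShort() & 0xFFFF;
        // Major versions start at 45 (JDK 1.0.2); reject obviously bogus values.
        return major >= 45;
    }

    public static void main(String[] args) {
        byte[] valid = { (byte) 0xCA, (byte) 0xFE, (byte) 0xBA, (byte) 0xBE,
                         0, 0, 0, 52 }; // major version 52 corresponds to Java 8
        byte[] invalid = { 0x00, 0x01, 0x02, 0x03, 0, 0, 0, 52 };
        System.out.println(hasValidHeader(valid));   // true
        System.out.println(hasValidHeader(invalid)); // false
    }
}
```

A real verifier goes far beyond this, of course; the header check is only the first of the structural rules, preceding constant-pool, bytecode, and type checks.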

While it is expected that compilers will generate valid class files, a Java Virtual Machine (JVM) could end up running invalid class files in a number of circumstances:

• Class file version mismatch. A newer version of one class's class file (c1-new) is used together with a second class file (c2) that was compiled against the older version of the first (c1-old). Changes to signatures or fields in c1-new may make it incompatible with c2.

• Bugs in code that dynamically generates or modifies class files. For example, the JVMTI interface allows the dynamic transformation of class files.

• Malicious attacks. Some of the class file format rules are designed to ensure that Java code cannot violate security constraints. Class files could be constructed which violate these rules in an attempt to circumvent this protection.

• Compiler bugs
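The failure mode these circumstances share can be made visible by handing the JVM deliberately malformed bytes: the structural checks reject them immediately. This is a minimal sketch; `MalformedClassDemo` is a name invented for the example, and it simply exposes the protected `ClassLoader.defineClass` method.

```java
// Feeding deliberately malformed bytes to defineClass shows the JVM's
// structural verification at work: bytes that do not form a valid class
// file are rejected with a ClassFormatError (a subclass of LinkageError).
public class MalformedClassDemo extends ClassLoader {

    public static String tryDefine(byte[] bytes) {
        MalformedClassDemo loader = new MalformedClassDemo();
        try {
            loader.defineClass(null, bytes, 0, bytes.length);
            return "accepted";
        } catch (LinkageError e) {
            return "rejected: " + e.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        byte[] garbage = { 1, 2, 3, 4 }; // not a class file: wrong magic number
        System.out.println(tryDefine(garbage)); // expect a rejection
    }
}
```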

Verification is the process by which the JVM validates that the classes it is asked to execute conform to the specification and rules. JVMs generally verify classes that are considered untrustworthy. For example, a class shipped as part of the JVM itself may be trusted, while a class downloaded from a remote source may not be. JVMs tend to be conservative about which classes they treat as trusted, even when malicious attacks are unlikely, because the bugs caught by verification would otherwise be hard to identify and debug when they surface later; intermittent strange behavior or crashes would be common outcomes.

Unfortunately, the runtime cost of verification is significant. Another data point is that when customers run a large number of JVMs in an environment where the Central Processing Unit (CPU) is over-committed, the JVM restarts must be staggered because of the startup CPU cost, much of which is associated with class loading and verification. Also demonstrating the desire to reduce verification costs, by default most JVM implementations do not complete all of the checks that are possible, requiring -Xfuture on the command line to add back the missing checks.
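For reference, the trade-off shows up directly on the command line. Exact flag behavior varies by JVM vendor and version, and MyApp is a placeholder application name:

```shell
# Strictest checking: -Xfuture (mentioned above) adds back the optional checks.
java -Xfuture MyApp

# Default: verify classes that are not trusted, with some checks relaxed.
java MyApp

# No verification at all (deprecated in recent JDKs): fastest startup, but a
# malformed class then surfaces only as hard-to-debug failures later on.
java -Xverify:none MyApp
```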

A method is needed to eliminate the hard-to-debug issues that present as side effects of reducing the amount of verification completed. This would allow JVMs to be more aggressive about the classes they trust. The net results would be real benefits in terms of startup time and C...