How to use jtreg - the Java Regression Test Harness
This FAQ is a growing list of questions asked by developers writing tests which will run using the regression extensions to the JavaTest harness, version 3. It is a supplement to the test-tag language specification and is intended to illuminate implications of the spec and to answer questions about this implementation of the spec. The test framework described here is intended primarily for unit and regression tests.
It can also be used for functional tests, and even simple product tests -- in other words, just about any type of test except a conformance test. Tests can often be written as small standalone Java programs, although in some cases an applet or a shell script might be required. A regression test is a test written specifically to check that a bug has been fixed and remains fixed. A regression test should fail when run against a build with the bug in question, and pass when run against a build in which the bug has been fixed.
Suppose, for example, that you evaluate a bug that turns out not to be a bug, but in the course of doing the evaluation you write a test that checks for correct behavior. You can help improve the quality of the JDK by checking this test into the test directory. The JavaTest harness is a set of tools designed to execute test programs. Among other things, the harness has evolved the ability to execute non-JCK testsuites.
The JDK regression test suite is one such suite. Henceforth, in this document, "the harness" refers to the JavaTest harness, version 3, together with the regression extensions.
For the harness to execute tests in a given test suite, it needs specialized code which knows how to find test descriptions and how to interpret them. The Tag Language Specification provides the needed descriptions. Originally, jtreg referred to a custom shell script that was used to invoke a custom entry point to the JavaTest harness. The script is no longer required, and the name jtreg is simply a popular equivalent for the name regtest.
It is recommended that you run jtreg on a platform that has been certified as Java Compatible. It requires a version equivalent to JDK 1. For Windows systems, the regression extensions require the installation of the MKS Toolkit, version 6. See the OpenJDK page for a suitable forum or mailing list. JUnit was not around when we started writing tests for the JDK. And the test tag specification has been specifically designed for testing the JDK, with support for testing applets, command-line interfaces, and so on, as well as simple API tests.
And by now, there are many thousands of tests written for jtreg, so it would not be practical to convert to JUnit. The simplest test is an ordinary Java program with the usual static main method. If the test fails, it should throw an exception; if it succeeds, it should return normally. The test tag identifies a source file that defines a test. If necessary, the harness will compile the source file, that is, when the class files are missing or older than the corresponding source files.
Other files which the test depends on must be specified with the run build action. The arguments to the test tag are ignored by the harness. The bug tag should be followed by one or more bug numbers, separated by spaces. The bug number is useful in diagnosing test failures. It's OK to write tests that don't have bug numbers, but if you're writing a test for a specific bug, please include its number in a bug tag.
The summary tag describes the condition that is checked by the test. It is especially useful for non-regression tests, which by definition don't have bug numbers, but even if there's a bug number it's helpful to include a summary. Note that a test summary is generally not the same thing as a Bugtraq synopsis, since the latter describes the bug rather than the condition that the bug violates.
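Putting the pieces together, the tags above go in a comment at the top of the test's source file. Here is a minimal sketch; the class name, bug number, and the condition being checked are all hypothetical, chosen only to illustrate the shape of a test:

```java
/* @test
   @bug 4000000
   @summary Math.max should return the larger of its two arguments
            (hypothetical bug number and condition, for illustration only)
*/
public class MaxOfTwo {
    public static void main(String[] args) {
        int result = Math.max(2, 5);
        if (result != 5) {
            // Fail by throwing an exception carrying a useful diagnostic.
            throw new RuntimeException("Math.max(2, 5) returned " + result);
        }
        // Returning normally from main tells the harness the test passed.
    }
}
```

The harness compiles the file if necessary, runs main, and treats normal return as a pass and any thrown exception as a failure.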
The arguments of a tag are the words between that tag and the next tag, if there is one, or the end of the comment enclosing the tags. In cases like the above example, in which you must check a condition in order to decide whether the test is to pass or fail, you have no choice but to construct an exception.
RuntimeException is a convenient choice since it's unchecked, so you don't have to sprinkle your code with throws clauses. On the other hand, if the test would naturally cause an exception to be thrown when it fails, it suffices to let that exception be propagated up through the main method.
If the exception you expect to be thrown in the failure case is a checked exception, then you'll need to provide the appropriate throws clauses in your code. In general, the advantage of throwing your own exception is that often you can provide better diagnostics.
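Both styles can appear in one test. The hypothetical sketch below lets a checked IOException propagate up through main via a throws clause, but throws its own RuntimeException, carrying the observed values, when the checked condition fails (the class name follows the naming advice later in this document):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

/* @test
   @summary After skip(4) on a 10-byte stream, available() should report 6
            (hypothetical test, for illustration only)
*/
public class SkipAvailable {
    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream(new byte[10]);
        long skipped = in.skip(4);   // may throw IOException; let it propagate
        int available = in.available();
        if (available != 6) {
            // Self-thrown exception: include the actual values observed,
            // giving better diagnostics than a bare propagated exception.
            throw new RuntimeException("after skip(4): expected available()==6, got "
                                       + available + " (skipped=" + skipped + ")");
        }
    }
}
```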
It is strongly recommended that you not catch general exceptions such as Throwable, Exception, or Error. Doing so can be potentially problematic. Tests that compare program output against a golden file of expected output are generally not recommended, since the output can be sensitive to the locale in which the tests are run, and may contain other details which may be hard to maintain, such as line numbers.
If your workspace doesn't yet contain this directory, do a bringover from your integration workspace. Checking a test into the workspace against which it's first written helps prevent spurious test failures. In the case of a regression test, this process ensures that the test will migrate upward in the workspace hierarchy along with the fix for the corresponding bug.
If tests were managed separately from fixes, then it would be difficult to distinguish between a true test failure and a failure due to version skew because the fix hadn't caught up with the test. Tests are generally organized following the structure of the Java API. For example, the test directory contains a java directory that has subdirectories lang, io, util, etc.
Each package directory contains one subdirectory for each class in the package. Thus tests for java.io.FileInputStream belong in test/java/io/FileInputStream. Each class directory may contain a combination of single-file tests and further subdirectories for tests that require more than one source file. The test directory in your workspace may not contain a particular subdirectory, since the source code management system creates directories only if they contain files that have been put under source code control.
If you run into this problem, just create the directory that you need and check your test files into it. The source code management system will take care of the rest. In general, try to give tests names that are as specific and descriptive as possible. If a test is checking the behavior of one or a few methods, its name should include the names of those methods.
A test written for a bug that involves both the skip and available methods could be named SkipAvailable. Tests that involve many methods require a little more creativity in naming, since it would be unwieldy to include the names of all the methods. Just choose a descriptive word or short phrase. It can be helpful to add more information to the test name to help further describe the test. For example, a test that checks the skip method's behavior when passed a negative count could be named SkipNegative.
You might find that the name you want to give your test has already been taken. In this case, either find a different name or, if you're just not in a creative mood, append an underscore and a digit to an existing name. Thus, if there were already a test named Skip, a new test could be named Skip_1. Some tests require more than one source file, or may need access to data files. In this case it's best to create a subdirectory in order to keep related files together.
The subdirectory should be given a descriptive name that begins with a lowercase letter.
Some tests involve more than one class in a package, in which case a new subdirectory in the relevant package directory should be created. For example, a set of general tests that exercise the character streams in the java.io package could live in a subdirectory such as test/java/io/charStreams. In addition to a java directory for API-related tests, the test directory contains a javax directory, a vm directory, a tools directory, and com and sun directories.
When a test is run by the harness, a special classloader is used so that the classpath is effectively set to include just the directory containing the test, plus the standard system classes. Thus name clashes between tests in different directories are not a problem. An alternative approach would be to associate a different package with each test directory. This is done, for example, in the JCK test suite.
The difficulty with this idea is that in order to debug a test under dbx or workshop or jdb or whatever you must set up your classpath in just the right way. This makes it difficult to diagnose bugs that are reported against specific tests. Bugs in the graphical facilities of the JDK generally require manual interaction with applets.
Applet tests are written in much the same way as the simple main tests described above. The primary differences are that the run tag specifies an applet action, to indicate that the test is an applet test, and that an appropriate HTML file is needed.
For example, an AWT test named Foo.java would be accompanied by a corresponding HTML file, Foo.html. The run tag tells the harness how to run the test. The remaining arguments to the run tag are passed to the program in a manner appropriate to the run type. In this case, the test will be run just as if the appletviewer had been invoked on Foo.html. When the harness runs a manual applet test, it will display the contents of the HTML file that defines the applet.
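A sketch of what such an HTML file might contain (the file name, dimensions, and the instruction text are hypothetical); the applet element is what the appletviewer, and hence the harness, acts on:

```html
<!-- Foo.html: displayed by the harness just as if appletviewer were run on it -->
<html>
<body>
<p>Instructions: resize the frame. If the component repaints without
artifacts, the test passes; otherwise it fails.</p>
<applet code="Foo.class" width="300" height="200"></applet>
</body>
</html>
```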
Include instructions in the HTML file so that the person running the test can figure out what to do if any interaction is required. The manual option in the run tag allows the harness to distinguish manual from automatic tests, which is important since the latter can be run without user interaction. There are actually three kinds of applet manual tests: self-contained tests, yesno tests, and done tests. A self-contained manual test handles all user interaction itself.
If the test fails, whether this is determined by the user or by the applet, then the applet must throw an exception. A yesno test requests the harness to ask the user whether the test passes or fails. To do this, the harness will put up pass and fail buttons, and it's up to the user to inspect the screen and click one of the buttons. The harness will take care of shutting down the applet.
The test will also fail if the applet throws an exception. A done test requests the harness to put up a done button.
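The three kinds correspond to different values of the manual option in the run tag. A sketch of the tag forms, with a hypothetical HTML file name:

```java
/* Self-contained: the applet handles all interaction and throws on failure.
   @run applet/manual Foo.html

   Yes/no: the harness shows pass and fail buttons for the user to click.
   @run applet/manual=yesno Foo.html

   Done: the harness shows a done button; the applet decides pass or fail.
   @run applet/manual=done Foo.html
*/
```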