| <!DOCTYPE html> |
| <html xmlns="http://www.w3.org/1999/xhtml" lang="" xml:lang=""> |
| <head> |
| <meta charset="utf-8" /> |
| <meta name="generator" content="pandoc" /> |
| <meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes" /> |
| <title>Testing the JDK</title> |
| <style> |
| code{white-space: pre-wrap;} |
| span.smallcaps{font-variant: small-caps;} |
| div.columns{display: flex; gap: min(4vw, 1.5em);} |
| div.column{flex: auto; overflow-x: auto;} |
| div.hanging-indent{margin-left: 1.5em; text-indent: -1.5em;} |
| ul.task-list{list-style: none;} |
| ul.task-list li input[type="checkbox"] { |
| width: 0.8em; |
| margin: 0 0.8em 0.2em -1.6em; |
| vertical-align: middle; |
| } |
| .display.math{display: block; text-align: center; margin: 0.5rem auto;} |
| </style> |
| <link rel="stylesheet" href="../make/data/docs-resources/resources/jdk-default.css" /> |
| <style type="text/css">pre, code, tt { color: #1d6ae5; }</style> |
| <!--[if lt IE 9]> |
| <script src="//cdnjs.cloudflare.com/ajax/libs/html5shiv/3.7.3/html5shiv-printshiv.min.js"></script> |
| <![endif]--> |
| </head> |
| <body> |
| <header id="title-block-header"> |
| <h1 class="title">Testing the JDK</h1> |
| </header> |
| <nav id="TOC" role="doc-toc"> |
| <ul> |
| <li><a href="#overview" id="toc-overview">Overview</a></li> |
| <li><a href="#running-tests-locally-with-make-test" |
| id="toc-running-tests-locally-with-make-test">Running tests locally with |
| <code>make test</code></a> |
| <ul> |
| <li><a href="#configuration" |
| id="toc-configuration">Configuration</a></li> |
| </ul></li> |
| <li><a href="#test-selection" id="toc-test-selection">Test selection</a> |
| <ul> |
| <li><a href="#common-test-groups" id="toc-common-test-groups">Common |
| Test Groups</a></li> |
| <li><a href="#jtreg" id="toc-jtreg">JTReg</a></li> |
| <li><a href="#gtest" id="toc-gtest">Gtest</a></li> |
| <li><a href="#microbenchmarks" |
| id="toc-microbenchmarks">Microbenchmarks</a></li> |
| <li><a href="#special-tests" id="toc-special-tests">Special |
| tests</a></li> |
| </ul></li> |
| <li><a href="#test-results-and-summary" |
| id="toc-test-results-and-summary">Test results and summary</a></li> |
| <li><a href="#test-suite-control" id="toc-test-suite-control">Test suite |
| control</a> |
| <ul> |
| <li><a href="#general-keywords-test_opts" |
| id="toc-general-keywords-test_opts">General keywords |
| (TEST_OPTS)</a></li> |
| <li><a href="#jtreg-keywords" id="toc-jtreg-keywords">JTReg |
| keywords</a></li> |
| <li><a href="#gtest-keywords" id="toc-gtest-keywords">Gtest |
| keywords</a></li> |
| <li><a href="#microbenchmark-keywords" |
| id="toc-microbenchmark-keywords">Microbenchmark keywords</a></li> |
| </ul></li> |
| <li><a href="#notes-for-specific-tests" |
| id="toc-notes-for-specific-tests">Notes for Specific Tests</a> |
| <ul> |
| <li><a href="#docker-tests" id="toc-docker-tests">Docker Tests</a></li> |
| <li><a href="#non-us-locale" id="toc-non-us-locale">Non-US |
| locale</a></li> |
| <li><a href="#pkcs11-tests" id="toc-pkcs11-tests">PKCS11 Tests</a></li> |
| <li><a href="#client-ui-tests" id="toc-client-ui-tests">Client UI |
| Tests</a></li> |
| </ul></li> |
| <li><a href="#editing-this-document" |
| id="toc-editing-this-document">Editing this document</a></li> |
| </ul> |
| </nav> |
| <h2 id="overview">Overview</h2> |
| <p>The bulk of JDK tests use <a |
| href="https://openjdk.org/jtreg/">jtreg</a>, a regression test framework |
| and test runner built for the JDK's specific needs. Other test |
| frameworks are also used. The different test frameworks can be executed |
| directly, but there is also a set of make targets intended to simplify |
| the interface, and figure out how to run your tests for you.</p> |
| <h2 id="running-tests-locally-with-make-test">Running tests locally with |
| <code>make test</code></h2> |
| <p>This is the easiest way to get started. Assuming you've built the JDK |
| locally, execute:</p> |
| <pre><code>$ make test</code></pre> |
| <p>This will run a default set of tests against the JDK, and present you |
| with the results. <code>make test</code> is part of a family of |
| test-related make targets which simplify running tests, because they |
| invoke the various test frameworks for you. The "make test framework" is |
simple to start with, but more complex ad-hoc combinations of tests are
| also possible. You can always invoke the test frameworks directly if you |
| want even more control.</p> |
| <p>Some example command-lines:</p> |
| <pre><code>$ make test-tier1 |
| $ make test-jdk_lang JTREG="JOBS=8" |
| $ make test TEST=jdk_lang |
| $ make test-only TEST="gtest:LogTagSet gtest:LogTagSetDescriptions" GTEST="REPEAT=-1" |
| $ make test TEST="hotspot:hotspot_gc" JTREG="JOBS=1;TIMEOUT_FACTOR=8;JAVA_OPTIONS=-XshowSettings -Xlog:gc+ref=debug" |
| $ make test TEST="jtreg:test/hotspot:hotspot_gc test/hotspot/jtreg/native_sanity/JniVersion.java" |
| $ make test TEST="micro:java.lang.reflect" MICRO="FORK=1;WARMUP_ITER=2" |
| $ make exploded-test TEST=tier2</code></pre> |
| <p>"tier1" and "tier2" refer to tiered testing, see further down. "TEST" |
| is a test selection argument which the make test framework will use to |
| try to find the tests you want. It iterates over the available test |
| frameworks, and if the test isn't present in one, it tries the next one. |
| The main target <code>test</code> uses the jdk-image as the tested |
| product. There is also an alternate target <code>exploded-test</code> |
| that uses the exploded image instead. Not all tests will run |
| successfully on the exploded image, but using this target can greatly |
| improve rebuild times for certain workflows.</p> |
| <p>Previously, <code>make test</code> was used to invoke an old system |
| for running tests, and <code>make run-test</code> was used for the new |
| test framework. For backward compatibility with scripts and muscle |
| memory, <code>run-test</code> and variants like |
| <code>exploded-run-test</code> or <code>run-test-tier1</code> are kept |
| as aliases.</p> |
| <h3 id="configuration">Configuration</h3> |
| <p>To be able to run JTReg tests, <code>configure</code> needs to know |
| where to find the JTReg test framework. If it is not picked up |
| automatically by configure, use the |
<code>--with-jtreg=&lt;path to jtreg home&gt;</code> option to point to
the JTReg framework. Note that this option should point to the JTReg
home, i.e. the top directory containing <code>lib/jtreg.jar</code> etc.
| (An alternative is to set the <code>JT_HOME</code> environment variable |
| to point to the JTReg home before running <code>configure</code>.)</p> |
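<p>For example, assuming JTReg is installed under
<code>/usr/local/jtreg</code> (a hypothetical location), either of the
following would work:</p>
<pre><code>$ bash configure --with-jtreg=/usr/local/jtreg
$ JT_HOME=/usr/local/jtreg bash configure</code></pre>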
| <p>To be able to run microbenchmarks, <code>configure</code> needs to |
| know where to find the JMH dependency. Use |
<code>--with-jmh=&lt;path to JMH jars&gt;</code> to point to a directory
| containing the core JMH and transitive dependencies. The recommended |
| dependencies can be retrieved by running |
| <code>sh make/devkit/createJMHBundle.sh</code>, after which |
| <code>--with-jmh=build/jmh/jars</code> should work.</p> |
<p>When tests fail or time out, jtreg runs its failure handler to
capture necessary data from the system where the test was run. This data
can then be used to analyze the test failures. Collecting this data
involves running various commands (which are listed in files residing in
<code>test/failure_handler/src/share/conf</code>), and some of these
commands use <code>sudo</code>. If the system's <code>sudoers</code>
file isn't configured to allow running these commands, you may be
prompted for a password during failure handler execution. When running
locally, collecting this additional data often isn't necessary. To
disable running the failure handler, use
<code>--enable-jtreg-failure-handler=no</code> when running
<code>configure</code>. If, however, you want to let the failure handler
run without being prompted for a sudo password, you can configure your
<code>sudoers</code> file appropriately. Please consult your operating
system's documentation for details; here we only show one possible
approach: edit the <code>/etc/sudoers.d/sudoers</code> file to include
the following line:</p>
| <pre><code>johndoe ALL=(ALL) NOPASSWD: /sbin/dmesg</code></pre> |
| <p>This line configures <code>sudo</code> to <em>not</em> prompt for |
| password for the <code>/sbin/dmesg</code> command (this is one of the |
| commands that is listed in the files at |
| <code>test/failure_handler/src/share/conf</code>), for the user |
<code>johndoe</code>. Here <code>johndoe</code> is the user account
under which the jtreg tests are run. Replace the username with the
relevant user account on your system.</p>
| <h2 id="test-selection">Test selection</h2> |
<p>All functionality is available using the <code>test</code> make
target. In this use case, the test or tests to be executed are selected
using the <code>TEST</code> variable. To speed up subsequent test runs
with no source code changes, <code>test-only</code> can be used instead,
which does not depend on the source and test image build.</p>
| <p>For some common top-level tests, direct make targets have been |
| generated. This includes all JTReg test groups, the hotspot gtest, and |
custom tests (if present). This means that <code>make test-tier1</code>
is equivalent to <code>make test TEST="tier1"</code>, but the former is
more tab-completion friendly. For more complex test runs, the
<code>test TEST="x"</code> solution needs to be used.</p>
<p>The test specifications given in <code>TEST</code> are parsed into
fully qualified test descriptors, which clearly and unambiguously show
which tests will be run. As an example, <code>:tier1</code> will expand
to include all subcomponent test directories that define
<code>tier1</code>, for example:
<code>jtreg:$(TOPDIR)/test/hotspot/jtreg:tier1 jtreg:$(TOPDIR)/test/jdk:tier1 jtreg:$(TOPDIR)/test/langtools:tier1 ...</code>.
You can always submit a list of fully qualified test descriptors in the
<code>TEST</code> variable if you want to bypass the parser.</p>
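<p>For instance, if you already know the fully qualified descriptor, you
can pass it directly (a sketch; the test root is relative to your
checkout):</p>
<pre><code>$ make test TEST="jtreg:test/jdk:tier1"</code></pre>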
| <h3 id="common-test-groups">Common Test Groups</h3> |
| <p>Ideally, all tests are run for every change but this may not be |
| practical due to the limited testing resources, the scope of the change, |
| etc.</p> |
| <p>The source tree currently defines a few common test groups in the |
| relevant <code>TEST.groups</code> files. There are test groups that |
| cover a specific component, for example <code>hotspot_gc</code>. It is a |
good idea to look into <code>TEST.groups</code> files to get a sense of
which tests are relevant to a particular JDK component.</p>
| <p>Component-specific tests may miss some unintended consequences of a |
| change, so other tests should also be run. Again, it might be |
| impractical to run all tests, and therefore <em>tiered</em> test groups |
| exist. Tiered test groups are not component-specific, but rather cover |
| the significant parts of the entire JDK.</p> |
| <p>Multiple tiers allow balancing test coverage and testing costs. Lower |
| test tiers are supposed to contain the simpler, quicker and more stable |
| tests. Higher tiers are supposed to contain progressively more thorough, |
| slower, and sometimes less stable tests, or the tests that require |
| special configuration.</p> |
| <p>Contributors are expected to run the tests for the areas that are |
| changed, and the first N tiers they can afford to run, but at least |
| tier1.</p> |
| <p>A brief description of the tiered test groups:</p> |
| <ul> |
| <li><p><code>tier1</code>: This is the lowest test tier. Multiple |
| developers run these tests every day. Because of the widespread use, the |
| tests in <code>tier1</code> are carefully selected and optimized to run |
fast, and to run in the most stable manner. Test failures in
<code>tier1</code> are usually followed up on quickly, either with
fixes, or by adding the relevant tests to the problem list. GitHub
Actions workflows, if enabled, run <code>tier1</code> tests.</p></li>
<li><p><code>tier2</code>: This test group covers even more ground. It
contains, among other things, tests that either run for too long to be
in <code>tier1</code>, may require special configuration, are less
stable, or cover a broader range of non-core JVM and JDK
features/components (for example, XML).</p></li>
| <li><p><code>tier3</code>: This test group includes more stressful |
| tests, the tests for corner cases not covered by previous tiers, plus |
the tests that require GUIs. As such, this suite should either be run
with low concurrency (<code>TEST_JOBS=1</code>), or without headful
tests (<code>JTREG_KEYWORDS=\!headful</code>), or both.</p></li>
| <li><p><code>tier4</code>: This test group includes every other test not |
| covered by previous tiers. It includes, for example, |
| <code>vmTestbase</code> suites for Hotspot, which run for many hours |
| even on large machines. It also runs GUI tests, so the same |
| <code>TEST_JOBS</code> and <code>JTREG_KEYWORDS</code> caveats |
| apply.</p></li> |
| </ul> |
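<p>Putting the caveats above together, a tier3 run could, as a sketch,
be limited like this:</p>
<pre><code>$ make test TEST=tier3 JTREG_KEYWORDS=\!headful
$ make test TEST=tier3 TEST_JOBS=1</code></pre>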
| <h3 id="jtreg">JTReg</h3> |
| <p>JTReg tests can be selected either by picking a JTReg test group, or |
| a selection of files or directories containing JTReg tests. |
| Documentation can be found at <a |
| href="https://openjdk.org/jtreg/">https://openjdk.org/jtreg/</a>, note |
| especially the extensive <a |
| href="https://openjdk.org/jtreg/faq.html">FAQ</a>.</p> |
| <p>JTReg test groups can be specified either without a test root, e.g. |
| <code>:tier1</code> (or <code>tier1</code>, the initial colon is |
| optional), or with, e.g. <code>hotspot:tier1</code>, |
| <code>test/jdk:jdk_util</code> or |
| <code>$(TOPDIR)/test/hotspot/jtreg:hotspot_all</code>. The test root can |
| be specified either as an absolute path, or a path relative to the JDK |
top directory, or the <code>test</code> directory. For simplicity, the
hotspot JTReg test root, which is really <code>hotspot/jtreg</code>, can
be abbreviated as just <code>hotspot</code>.</p>
| <p>When specified without a test root, all matching groups from all test |
| roots will be added. Otherwise, only the group from the specified test |
| root will be added.</p> |
| <p>Individual JTReg tests or directories containing JTReg tests can also |
| be specified, like |
| <code>test/hotspot/jtreg/native_sanity/JniVersion.java</code> or |
| <code>hotspot/jtreg/native_sanity</code>. Just like for test root |
| selection, you can either specify an absolute path (which can even point |
| to JTReg tests outside the source tree), or a path relative to either |
| the JDK top directory or the <code>test</code> directory. |
| <code>hotspot</code> can be used as an alias for |
| <code>hotspot/jtreg</code> here as well.</p> |
| <p>As long as the test groups or test paths can be uniquely resolved, |
| you do not need to enter the <code>jtreg:</code> prefix. If this is not |
| possible, or if you want to use a fully qualified test descriptor, add |
| <code>jtreg:</code>, e.g. |
| <code>jtreg:test/hotspot/jtreg/native_sanity</code>.</p> |
| <h3 id="gtest">Gtest</h3> |
| <p><strong>Note:</strong> To be able to run the Gtest suite, you need to |
| configure your build to be able to find a proper version of the gtest |
| source. For details, see the section <a |
| href="building.html#running-tests">"Running Tests" in the build |
| documentation</a>.</p> |
| <p>Since the Hotspot Gtest suite is so quick, the default is to run all |
| tests. This is specified by just <code>gtest</code>, or as a fully |
| qualified test descriptor <code>gtest:all</code>.</p> |
| <p>If you want, you can single out an individual test or a group of |
| tests, for instance <code>gtest:LogDecorations</code> or |
| <code>gtest:LogDecorations.level_test_vm</code>. This can be |
| particularly useful if you want to run a shaky test repeatedly.</p> |
| <p>For Gtest, there is a separate test suite for each JVM variant. The |
JVM variant is defined by adding <code>/&lt;variant&gt;</code> to the
| test descriptor, e.g. <code>gtest:Log/client</code>. If you specify no |
| variant, gtest will run once for each JVM variant present (e.g. server, |
| client). So if you only have the server JVM present, then |
| <code>gtest:all</code> will be equivalent to |
| <code>gtest:all/server</code>.</p> |
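<p>For example, to run the log-related Gtest tests only against the
server variant (assuming a server JVM is present in your build):</p>
<pre><code>$ make test TEST="gtest:Log/server"</code></pre>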
| <h3 id="microbenchmarks">Microbenchmarks</h3> |
| <p>Which microbenchmarks to run is selected using a regular expression |
| following the <code>micro:</code> test descriptor, e.g., |
| <code>micro:java.lang.reflect</code>. This delegates the test selection |
| to JMH, meaning package name, class name and even benchmark method names |
| can be used to select tests.</p> |
| <p>Using special characters like <code>|</code> in the regular |
| expression is possible, but needs to be escaped multiple times: |
| <code>micro:ArrayCopy\\\\\|reflect</code>.</p> |
| <h3 id="special-tests">Special tests</h3> |
| <p>A handful of odd tests that are not covered by any other testing |
| framework are accessible using the <code>special:</code> test |
| descriptor. Currently, this includes <code>failure-handler</code> and |
| <code>make</code>.</p> |
| <ul> |
| <li><p>Failure handler testing is run using |
| <code>special:failure-handler</code> or just |
| <code>failure-handler</code> as test descriptor.</p></li> |
| <li><p>Tests for the build system, including both makefiles and related |
| functionality, is run using <code>special:make</code> or just |
| <code>make</code> as test descriptor. This is equivalent to |
| <code>special:make:all</code>.</p> |
| <p>A specific make test can be run by supplying it as argument, e.g. |
| <code>special:make:idea</code>. As a special syntax, this can also be |
| expressed as <code>make-idea</code>, which allows for command lines as |
| <code>make test-make-idea</code>.</p></li> |
| </ul> |
| <h2 id="test-results-and-summary">Test results and summary</h2> |
| <p>At the end of the test run, a summary of all tests run will be |
| presented. This will have a consistent look, regardless of what test |
| suites were used. This is a sample summary:</p> |
| <pre><code>============================== |
| Test summary |
| ============================== |
| TEST TOTAL PASS FAIL ERROR |
&gt;&gt; jtreg:jdk/test:tier1 1867 1865 2 0 &lt;&lt;
| jtreg:langtools/test:tier1 4711 4711 0 0 |
| jtreg:nashorn/test:tier1 133 133 0 0 |
| ============================== |
| TEST FAILURE</code></pre> |
| <p>Tests where the number of TOTAL tests does not equal the number of |
| PASSed tests will be considered a test failure. These are marked with |
the <code>&gt;&gt; ... &lt;&lt;</code> marker for easy
| identification.</p> |
| <p>The classification of non-passed tests differs a bit between test |
| suites. In the summary, ERROR is used as a catch-all for tests that |
neither passed nor are classified as failed by the framework. This might
indicate a test framework error, a timeout, or other problems.</p>
| <p>In case of test failures, <code>make test</code> will exit with a |
| non-zero exit value.</p> |
| <p>All tests have their result stored in |
| <code>build/$BUILD/test-results/$TEST_ID</code>, where TEST_ID is a |
| path-safe conversion from the fully qualified test descriptor, e.g. for |
| <code>jtreg:jdk/test:tier1</code> the TEST_ID is |
| <code>jtreg_jdk_test_tier1</code>. This path is also printed in the log |
| at the end of the test run.</p> |
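<p>The conversion simply replaces characters that are not path-safe with
underscores; a minimal shell sketch of the idea (not the actual build
logic):</p>
<pre><code>$ printf '%s\n' "jtreg:jdk/test:tier1" | tr ':/' '__'
jtreg_jdk_test_tier1</code></pre>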
| <p>Additional work data is stored in |
| <code>build/$BUILD/test-support/$TEST_ID</code>. For some frameworks, |
| this directory might contain information that is useful in determining |
| the cause of a failed test.</p> |
| <h2 id="test-suite-control">Test suite control</h2> |
| <p>It is possible to control various aspects of the test suites using |
| make control variables.</p> |
| <p>These variables use a keyword=value approach to allow multiple values |
| to be set. So, for instance, |
| <code>JTREG="JOBS=1;TIMEOUT_FACTOR=8"</code> will set the JTReg |
| concurrency level to 1 and the timeout factor to 8. This is equivalent |
| to setting <code>JTREG_JOBS=1 JTREG_TIMEOUT_FACTOR=8</code>, but using |
| the keyword format means that the <code>JTREG</code> variable is parsed |
| and verified for correctness, so <code>JTREG="TMIEOUT_FACTOR=8"</code> |
| would give an error, while <code>JTREG_TMIEOUT_FACTOR=8</code> would |
| just pass unnoticed.</p> |
| <p>To separate multiple keyword=value pairs, use <code>;</code> |
| (semicolon). Since the shell normally eats <code>;</code>, the |
recommended usage is to write the assignment inside quotes, e.g.
| <code>JTREG="...;..."</code>. This will also make sure spaces are |
| preserved, as in |
| <code>JTREG="JAVA_OPTIONS=-XshowSettings -Xlog:gc+ref=debug"</code>.</p> |
| <p>(Other ways are possible, e.g. using backslash: |
| <code>JTREG=JOBS=1\;TIMEOUT_FACTOR=8</code>. Also, as a special |
| technique, the string <code>%20</code> will be replaced with space for |
| certain options, e.g. |
| <code>JTREG=JAVA_OPTIONS=-XshowSettings%20-Xlog:gc+ref=debug</code>. |
| This can be useful if you have layers of scripts and have trouble |
| getting proper quoting of command line arguments through.)</p> |
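<p>If you need to prepare such a value in an outer script, the encoding
is a plain string substitution; a minimal sketch:</p>
<pre><code>$ printf '%s\n' "-XshowSettings -Xlog:gc+ref=debug" | sed 's/ /%20/g'
-XshowSettings%20-Xlog:gc+ref=debug</code></pre>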
| <p>As far as possible, the names of the keywords have been standardized |
| between test suites.</p> |
| <h3 id="general-keywords-test_opts">General keywords (TEST_OPTS)</h3> |
<p>Some keywords are valid across different test suites. If you want to
run tests from multiple test suites, or just don't want to care which
test-suite-specific control variable to use, then you can use the
general TEST_OPTS control variable.</p>
<p>There are also some keywords that apply globally to the test runner
system, not to any specific test suite. These are also available as
TEST_OPTS keywords.</p>
| <h4 id="jobs">JOBS</h4> |
| <p>Currently only applies to JTReg.</p> |
| <h4 id="timeout_factor">TIMEOUT_FACTOR</h4> |
| <p>Currently only applies to JTReg.</p> |
| <h4 id="java_options">JAVA_OPTIONS</h4> |
| <p>Applies to JTReg, GTest and Micro.</p> |
| <h4 id="vm_options">VM_OPTIONS</h4> |
| <p>Applies to JTReg, GTest and Micro.</p> |
| <h4 id="aot_modules">AOT_MODULES</h4> |
| <p>Applies to JTReg and GTest.</p> |
| <h4 id="jcov">JCOV</h4> |
<p>This keyword applies globally to the test runner system. If set to
<code>true</code>, it enables JCov coverage reporting for all tests run.
To be useful, the JDK under test must be run with a JDK built with JCov
instrumentation
(<code>configure --with-jcov=&lt;path to directory containing lib/jcov.jar&gt;</code>,
<code>make jcov-image</code>).</p>
| <p>The simplest way to run tests with JCov coverage report is to use the |
| special target <code>jcov-test</code> instead of <code>test</code>, e.g. |
| <code>make jcov-test TEST=jdk_lang</code>. This will make sure the JCov |
| image is built, and that JCov reporting is enabled.</p> |
| <p>The JCov report is stored in |
| <code>build/$BUILD/test-results/jcov-output/report</code>.</p> |
| <p>Please note that running with JCov reporting can be very memory |
| intensive.</p> |
| <h4 id="jcov_diff_changeset">JCOV_DIFF_CHANGESET</h4> |
| <p>While collecting code coverage with JCov, it is also possible to find |
| coverage for only recently changed code. JCOV_DIFF_CHANGESET specifies a |
| source revision. A textual report will be generated showing coverage of |
| the diff between the specified revision and the repository tip.</p> |
| <p>The report is stored in |
| <code>build/$BUILD/test-results/jcov-output/diff_coverage_report</code> |
| file.</p> |
| <h3 id="jtreg-keywords">JTReg keywords</h3> |
| <h4 id="jobs-1">JOBS</h4> |
| <p>The test concurrency (<code>-concurrency</code>).</p> |
| <p>Defaults to TEST_JOBS (if set by <code>--with-test-jobs=</code>), |
| otherwise it defaults to JOBS, except for Hotspot, where the default is |
| <em>number of CPU cores/2</em>, but never more than <em>memory size in |
| GB/2</em>.</p> |
| <h4 id="timeout_factor-1">TIMEOUT_FACTOR</h4> |
| <p>The timeout factor (<code>-timeoutFactor</code>).</p> |
| <p>Defaults to 4.</p> |
| <h4 id="failure_handler_timeout">FAILURE_HANDLER_TIMEOUT</h4> |
| <p>Sets the argument <code>-timeoutHandlerTimeout</code> for JTReg. The |
| default value is 0. This is only valid if the failure handler is |
| built.</p> |
| <h4 id="jtreg_test_thread_factory">JTREG_TEST_THREAD_FACTORY</h4> |
| <p>Sets the <code>-testThreadFactory</code> for JTReg. It should be the |
| fully qualified classname of a class which implements |
| <code>java.util.concurrent.ThreadFactory</code>. One such implementation |
| class, named Virtual, is currently part of the JDK build in the |
| <code>test/jtreg_test_thread_factory/</code> directory. This class gets |
| compiled during the test image build. The implementation of the Virtual |
| class creates a new virtual thread for executing each test class.</p> |
| <h4 id="test_mode">TEST_MODE</h4> |
| <p>The test mode (<code>agentvm</code> or <code>othervm</code>).</p> |
| <p>Defaults to <code>agentvm</code>.</p> |
| <h4 id="assert">ASSERT</h4> |
| <p>Enable asserts (<code>-ea -esa</code>, or none).</p> |
| <p>Set to <code>true</code> or <code>false</code>. If true, adds |
| <code>-ea -esa</code>. Defaults to true, except for hotspot.</p> |
| <h4 id="verbose">VERBOSE</h4> |
| <p>The verbosity level (<code>-verbose</code>).</p> |
| <p>Defaults to <code>fail,error,summary</code>.</p> |
| <h4 id="retain">RETAIN</h4> |
| <p>What test data to retain (<code>-retain</code>).</p> |
| <p>Defaults to <code>fail,error</code>.</p> |
| <h4 id="max_mem">MAX_MEM</h4> |
| <p>Limit memory consumption (<code>-Xmx</code> and |
| <code>-vmoption:-Xmx</code>, or none).</p> |
| <p>Limit memory consumption for JTReg test framework and VM under test. |
| Set to 0 to disable the limits.</p> |
| <p>Defaults to 512m, except for hotspot, where it defaults to 0 (no |
| limit).</p> |
| <h4 id="max_output">MAX_OUTPUT</h4> |
| <p>Set the property <code>javatest.maxOutputSize</code> for the |
| launcher, to change the default JTReg log limit.</p> |
| <h4 id="keywords">KEYWORDS</h4> |
| <p>JTReg keywords sent to JTReg using <code>-k</code>. Please be careful |
| in making sure that spaces and special characters (like <code>!</code>) |
| are properly quoted. To avoid some issues, the special value |
| <code>%20</code> can be used instead of space.</p> |
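<p>For example, to run only the headful tests in a group (using
<code>jdk_desktop</code> here as an example group):</p>
<pre><code>$ make test TEST=jdk_desktop JTREG="KEYWORDS=headful"</code></pre>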
| <h4 id="extra_problem_lists">EXTRA_PROBLEM_LISTS</h4> |
<p>Use one or more additional problem list files, in addition to the
default ProblemList.txt located at the JTReg test roots.</p>
| <p>If multiple file names are specified, they should be separated by |
| space (or, to help avoid quoting issues, the special value |
| <code>%20</code>).</p> |
| <p>The file names should be either absolute, or relative to the JTReg |
| test root of the tests to be run.</p> |
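<p>For example, with a hypothetical extra problem list file placed in
the jdk test root:</p>
<pre><code>$ make test TEST=tier1 JTREG="EXTRA_PROBLEM_LISTS=ProblemList-my.txt"</code></pre>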
| <h4 id="run_problem_lists">RUN_PROBLEM_LISTS</h4> |
| <p>Use the problem lists to select tests instead of excluding them.</p> |
| <p>Set to <code>true</code> or <code>false</code>. If <code>true</code>, |
| JTReg will use <code>-match:</code> option, otherwise |
| <code>-exclude:</code> will be used. Default is <code>false</code>.</p> |
| <h4 id="options">OPTIONS</h4> |
| <p>Additional options to the JTReg test framework.</p> |
| <p>Use <code>JTREG="OPTIONS=--help all"</code> to see all available |
| JTReg options.</p> |
| <h4 id="java_options-1">JAVA_OPTIONS</h4> |
| <p>Additional Java options for running test classes (sent to JTReg as |
| <code>-javaoption</code>).</p> |
| <h4 id="vm_options-1">VM_OPTIONS</h4> |
| <p>Additional Java options to be used when compiling and running classes |
| (sent to JTReg as <code>-vmoption</code>).</p> |
| <p>This option is only needed in special circumstances. To pass Java |
| options to your test classes, use <code>JAVA_OPTIONS</code>.</p> |
| <h4 id="launcher_options">LAUNCHER_OPTIONS</h4> |
| <p>Additional Java options that are sent to the java launcher that |
| starts the JTReg harness.</p> |
| <h4 id="aot_modules-1">AOT_MODULES</h4> |
| <p>Generate AOT modules before testing for the specified module, or set |
| of modules. If multiple modules are specified, they should be separated |
| by space (or, to help avoid quoting issues, the special value |
| <code>%20</code>).</p> |
| <h4 id="retry_count">RETRY_COUNT</h4> |
<p>Retry failed tests up to a set number of times, until they pass. This
lets tests with intermittent failures eventually pass. Defaults to 0.</p>
| <h4 id="repeat_count">REPEAT_COUNT</h4> |
| <p>Repeat the tests up to a set number of times, stopping at first |
| failure. This helps to reproduce intermittent test failures. Defaults to |
| 0.</p> |
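<p>As a sketch, retrying flaky failures or repeating a single test (the
test path here is hypothetical) could look like:</p>
<pre><code>$ make test TEST=tier1 JTREG="RETRY_COUNT=2"
$ make test TEST="jtreg:test/jdk/some/pkg/SomeTest.java" JTREG="REPEAT_COUNT=10"</code></pre>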
| <h4 id="report">REPORT</h4> |
| <p>Use this report style when reporting test results (sent to JTReg as |
| <code>-report</code>). Defaults to <code>files</code>.</p> |
| <h3 id="gtest-keywords">Gtest keywords</h3> |
| <h4 id="repeat">REPEAT</h4> |
| <p>The number of times to repeat the tests |
| (<code>--gtest_repeat</code>).</p> |
| <p>Default is 1. Set to -1 to repeat indefinitely. This can be |
| especially useful combined with |
| <code>OPTIONS=--gtest_break_on_failure</code> to reproduce an |
| intermittent problem.</p> |
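<p>For instance, to hunt for an intermittent Gtest failure, repeating
indefinitely and stopping at the first failure (a sketch):</p>
<pre><code>$ make test TEST=gtest:LogDecorations GTEST="REPEAT=-1;OPTIONS=--gtest_break_on_failure"</code></pre>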
| <h4 id="options-1">OPTIONS</h4> |
| <p>Additional options to the Gtest test framework.</p> |
| <p>Use <code>GTEST="OPTIONS=--help"</code> to see all available Gtest |
| options.</p> |
| <h4 id="aot_modules-2">AOT_MODULES</h4> |
| <p>Generate AOT modules before testing for the specified module, or set |
| of modules. If multiple modules are specified, they should be separated |
| by space (or, to help avoid quoting issues, the special value |
| <code>%20</code>).</p> |
| <h3 id="microbenchmark-keywords">Microbenchmark keywords</h3> |
| <h4 id="fork">FORK</h4> |
| <p>Override the number of benchmark forks to spawn. Same as specifying |
<code>-f &lt;num&gt;</code>.</p>
| <h4 id="iter">ITER</h4> |
| <p>Number of measurement iterations per fork. Same as specifying |
<code>-i &lt;num&gt;</code>.</p>
| <h4 id="time">TIME</h4> |
| <p>Amount of time to spend in each measurement iteration, in seconds. |
Same as specifying <code>-r &lt;num&gt;</code>.</p>
| <h4 id="warmup_iter">WARMUP_ITER</h4> |
| <p>Number of warmup iterations to run before the measurement phase in |
each fork. Same as specifying <code>-wi &lt;num&gt;</code>.</p>
| <h4 id="warmup_time">WARMUP_TIME</h4> |
| <p>Amount of time to spend in each warmup iteration. Same as specifying |
<code>-w &lt;num&gt;</code>.</p>
| <h4 id="results_format">RESULTS_FORMAT</h4> |
| <p>Specify to have the test run save a log of the values. Accepts the |
| same values as <code>-rff</code>, i.e., <code>text</code>, |
| <code>csv</code>, <code>scsv</code>, <code>json</code>, or |
| <code>latex</code>.</p> |
| <h4 id="vm_options-2">VM_OPTIONS</h4> |
<p>Additional VM arguments to provide to forked-off VMs. Same as
<code>-jvmArgs &lt;args&gt;</code>.</p>
| <h4 id="options-2">OPTIONS</h4> |
| <p>Additional arguments to send to JMH.</p> |
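<p>Combining several of the keywords above, a microbenchmark run could,
as a sketch, look like:</p>
<pre><code>$ make test TEST="micro:java.lang.reflect" MICRO="FORK=1;WARMUP_ITER=2;ITER=5;TIME=2"</code></pre>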
| <h2 id="notes-for-specific-tests">Notes for Specific Tests</h2> |
| <h3 id="docker-tests">Docker Tests</h3> |
| <p>Docker tests with default parameters may fail on systems with glibc |
| versions not compatible with the one used in the default docker image |
| (e.g., Oracle Linux 7.6 for x86). For example, they pass on Ubuntu 16.04 |
| but fail on Ubuntu 18.04 if run like this on x86:</p> |
| <pre><code>$ make test TEST="jtreg:test/hotspot/jtreg/containers/docker"</code></pre> |
<p>To run these tests correctly on Ubuntu 18.04, you need to specify an
appropriate docker image with additional parameters, passed via
<code>JAVA_OPTIONS</code>:</p>
| <pre><code>$ make test TEST="jtreg:test/hotspot/jtreg/containers/docker" \ |
| JTREG="JAVA_OPTIONS=-Djdk.test.docker.image.name=ubuntu |
| -Djdk.test.docker.image.version=latest"</code></pre> |
| <h3 id="non-us-locale">Non-US locale</h3> |
| <p>If your locale is non-US, some tests are likely to fail. To work |
| around this you can set the locale to US. On Unix platforms, simply |
| setting <code>LANG="en_US"</code> in the environment before running |
| tests should work. On Windows or macOS, setting |
| <code>JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US"</code> |
| helps for most, but not all, test cases.</p> |
| <p>For example:</p> |
| <pre><code>$ export LANG="en_US" && make test TEST=... |
| $ make test JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US" TEST=...</code></pre> |
| <h3 id="pkcs11-tests">PKCS11 Tests</h3> |
| <p>It is highly recommended to use the latest NSS version when running |
| PKCS11 tests. An improper NSS version may lead to unexpected failures |
| that are hard to diagnose. For example, |
| sun/security/pkcs11/Secmod/AddTrustedCert.java may fail on Ubuntu 18.04 |
| with the default NSS version in the system. To run these tests |
| correctly, the system property |
| <code>jdk.test.lib.artifacts.<NAME></code> is required on Ubuntu |
| 18.04 to specify the alternative NSS lib directory. The |
| <code><NAME></code> component should be replaced with the name |
| element of the appropriate <code>@Artifact</code> class. (See |
| <code>test/jdk/sun/security/pkcs11/PKCS11Test.java</code>)</p> |
| <p>For example:</p> |
| <pre><code>$ make test TEST="jtreg:sun/security/pkcs11/Secmod/AddTrustedCert.java" \ |
| JTREG="JAVA_OPTIONS=-Djdk.test.lib.artifacts.nsslib-linux_aarch64=/path/to/NSS-libs"</code></pre> |
| <p>For more notes about the PKCS11 tests, please refer to |
| test/jdk/sun/security/pkcs11/README.</p> |
| <h3 id="client-ui-tests">Client UI Tests</h3> |
| <h4 id="system-key-shortcuts">System key shortcuts</h4> |
| <p>Some Client UI tests use key sequences which may be reserved by the |
| operating system. Usually this causes the test to fail, so it is highly |
| recommended to disable system key shortcuts before testing. The steps |
| to access and disable system key shortcuts for various platforms are |
| provided below.</p> |
| <h5 id="macos">macOS</h5> |
| <p>Choose Apple menu -> System Preferences, click Keyboard, then |
| click Shortcuts; select or deselect the desired shortcut.</p> |
| <p>For example, |
| test/jdk/javax/swing/TooltipManager/JMenuItemToolTipKeyBindingsTest/JMenuItemToolTipKeyBindingsTest.java |
| fails on macOS because it uses the <code>CTRL + F1</code> key sequence |
| to show or hide a tooltip message, but that key combination is reserved |
| by the operating system. To run the test correctly, follow the steps |
| described above and deselect the "Turn keyboard access on or off" |
| option, which is responsible for the <code>CTRL + F1</code> |
| combination.</p> |
| <h5 id="linux">Linux</h5> |
| <p>Open the Activities overview and start typing Settings; choose |
| Settings, click Devices, then click Keyboard; set or override the |
| desired shortcut.</p> |
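| <p>On GNOME-based desktops, current window-manager key bindings can |
| also be inspected from the command line before deciding what to |
| change (this assumes the <code>gsettings</code> tool and the standard |
| GNOME schema are available):</p> |
| <pre><code>$ gsettings list-recursively org.gnome.desktop.wm.keybindings</code></pre> |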
| <h5 id="windows">Windows</h5> |
| <p>Type <code>gpedit</code> in the Search and then click Edit group |
| policy; navigate to User Configuration -> Administrative Templates |
| -> Windows Components -> File Explorer; in the right-side pane |
| look for "Turn off Windows key hotkeys" and double click on it; enable |
| or disable hotkeys.</p> |
| <p>Note: a restart is required for the settings to take effect.</p> |
| <h4 id="robot-api">Robot API</h4> |
| <p>Most automated Client UI tests use the <code>Robot</code> API to |
| control the UI. Usually, the default operating system settings need to |
| be adjusted for Robot to work correctly. Detailed steps on how to |
| access and update these settings for different platforms are provided |
| below.</p> |
| <h5 id="macos-1">macOS</h5> |
| <p><code>Robot</code> is not permitted to control your Mac by default |
| since macOS 10.15. To allow it, choose Apple menu -> System Settings, |
| click Privacy & Security; then click Accessibility and ensure the |
| following apps are allowed to control your computer: <em>Java</em> and |
| <em>Terminal</em>. If the tests are run from an IDE, the IDE should be |
| granted this permission too.</p> |
| <h5 id="windows-1">Windows</h5> |
| <p>On Windows, if a Cygwin terminal is used to run the tests, there is |
| a delay in focus transfer which usually causes automated UI tests to |
| fail. To disable the delay, type <code>regedit</code> in the Search and |
| then select Registry Editor; navigate to the following key: |
| <code>HKEY_CURRENT_USER\Control Panel\Desktop</code>; make sure the |
| <code>ForegroundLockTimeout</code> value is set to 0.</p> |
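| <p>Equivalently, the value can be set from a command prompt with the |
| standard <code>reg</code> tool; this is a convenience sketch of the |
| same registry change described above:</p> |
| <pre><code>reg add "HKCU\Control Panel\Desktop" /v ForegroundLockTimeout /t REG_DWORD /d 0 /f</code></pre> |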
| <p>Additional information about Client UI test configuration for |
| various operating systems can be obtained at <a |
| href="https://wiki.openjdk.org/display/ClientLibs/Automated+client+GUI+testing+system+set+up+requirements">Automated |
| client GUI testing system set up requirements</a>.</p> |
| <h2 id="editing-this-document">Editing this document</h2> |
| <p>If you want to contribute changes to this document, edit |
| <code>doc/testing.md</code> and then run |
| <code>make update-build-docs</code> to generate the same changes in |
| <code>doc/testing.html</code>.</p> |
| </body> |
| </html> |