Tests on METADATA PRs

So I thought I'd document the tests we run on PRs to METADATA.jl when you register a new package, or add a new version to an existing package.

One set of tests runs on Travis. These tests validate the METADATA entries themselves, enforcing our policies regarding package version numbers and the like [1]. They do not run or verify any of your package code. These tests are mandatory: the PR cannot be merged if they fail.
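As an illustration of the kind of policy check involved, the sketch below tests that a new tag bumps exactly one version component by one. This is a hypothetical reconstruction in Julia, not the actual Travis script:

```julia
# Hypothetical sketch of one policy check (not the actual Travis script):
# a new tag should bump exactly one of major/minor/patch by one,
# resetting the lower components to zero.
function valid_bump(prev::VersionNumber, new::VersionNumber)
    new in (VersionNumber(prev.major + 1, 0, 0),
            VersionNumber(prev.major, prev.minor + 1, 0),
            VersionNumber(prev.major, prev.minor, prev.patch + 1))
end

valid_bump(v"0.1.2", v"0.2.0")  # true
valid_bump(v"0.1.2", v"0.3.0")  # false: skips a minor version
```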

The second set of tests is designated “JuliaCIBot”. These run the package tests for the package being tagged, as well as for a subset of packages that depend on it. All tests are run at the new tag as well as the previous tag. The run is marked as a failure if the package’s own tests fail, or if any dependent package had passing tests at the previous version but failing tests at the new one.
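In rough pseudocode, that failure rule amounts to the following (an illustrative Julia sketch, not the bot’s actual code; all names are made up):

```julia
# Illustrative sketch of the JuliaCIBot pass/fail rule (not the actual bot code).
# dep_results maps each dependent package to (passed_at_previous_tag, passed_at_new_tag).
function pr_fails(pkg_passes_at_new_tag::Bool,
                  dep_results::Dict{String,Tuple{Bool,Bool}})
    pkg_passes_at_new_tag || return true           # the tagged package itself must pass
    for (dep, (passed_old, passed_new)) in dep_results
        passed_old && !passed_new && return true   # a dependent regressed at the new tag
    end
    return false                                   # no failures, no regressions
end
```

Note that a dependent that was already failing at the previous tag does not count against the PR; only a pass-to-fail transition does.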

The primary purpose of the JuliaCIBot tests is to ensure that the package code works, and that the new version does not break any packages that depend on it. PRs are merged much more quickly if they pass both the Travis and JuliaCIBot tests. While JuliaCIBot failures are not currently a blocker for merging a PR, they do need to be investigated, and thus take more time and effort.

So herewith a few tips based on how we run these tests:

  • Please keep an eye out for the test results after you submit your PR. There are situations where bugs in the infrastructure cause tests to fail, so please ask if you see any unexpected failures.
  • Some packages cannot be tested automatically (for example, if they need proprietary code or licenses, or access to cloud services). Exceptions have been created for many of those packages, but please ask about any we have missed.
  • Tests are run using Pkg.test(), so any native dependencies should be installed using BinDeps (see the deps/build.jl sketch after this list). In particular, if you install dependencies in your .travis.yml, the tests will fail in our environment.
  • Related to the above, the tests run in an Ubuntu environment with Julia 0.5 and 0.6 installed. If you use the AptGet provider for BinDeps, things should just work. We pre-install some major libraries in the image (python, R, zmq, gtk, imagemagick, matplotlib), but everything else is expected to be provided via BinDeps.
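For reference, here is a minimal deps/build.jl sketch of the BinDeps pattern described above. The library name libfoo is a placeholder; on the test image it is the AptGet provider that actually fires:

```julia
# Minimal deps/build.jl sketch; "libfoo" is a hypothetical library name.
using BinDeps

@BinDeps.setup

libfoo = library_dependency("libfoo")

# The AptGet provider is what fires on the Ubuntu test image; other
# providers (Yum, Homebrew, Sources + BuildProcess) can be declared
# alongside it for other platforms.
provides(AptGet, "libfoo-dev", libfoo)

@BinDeps.install Dict(:libfoo => :libfoo)
```

Declaring the apt package through BinDeps, rather than in .travis.yml, is what makes the dependency visible in our environment.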

Hope this helps elucidate what is going on with the METADATA tests. Please ask (here, or on a METADATA PR) if anything needs clarification.

References:
[1] Current METADATA policy: https://github.com/JuliaLang/METADATA.jl/blob/metadata-v2/README.md
[2] Package development docs: https://docs.julialang.org/en/stable/manual/packages/#Package-Development-1


Thanks for the clarification Avik; as you know, this happened to me recently. Perhaps you want to add the “close PR, re-open PR” trick to trigger the tests again if the METADATA tests fail for a non-package-related issue.

Is there any documentation on the exact build process used for the JuliaCIBot tests? I’m having some very strange test issues that are quite difficult to debug just from the logfile, and I’d like to be able to run the tests locally in Docker or similar.

The tests run in a pretty vanilla Ubuntu 16.04 environment within a Docker container. They test your package and its dependents at the current tag and the previous registered tag. Ping me or Nishant (@nkottary) on the GitHub issue if required, and we can investigate.
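If you want to approximate that locally, something along these lines should get close. This is illustrative only: “MyPackage” and the version numbers are placeholders, and it assumes a Julia 0.5 or 0.6 binary inside an ubuntu:16.04 container:

```julia
# Rough local approximation of a JuliaCIBot run, using the old Pkg API.
# "MyPackage" and the versions are hypothetical; run this inside an
# ubuntu:16.04 container with Julia installed to mirror the bot's setup.
Pkg.add("MyPackage")
Pkg.pin("MyPackage", v"1.2.3")   # the new tag under test
Pkg.test("MyPackage")
Pkg.pin("MyPackage", v"1.2.2")   # the previous registered tag, for comparison
Pkg.test("MyPackage")
Pkg.free("MyPackage")
```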

Ok, good to know. Thanks!