NBTesting is a simple utility for writing tests in your IJulia notebooks, alongside other plots and computations. How it works:
- Add tests to your notebook using `Base.Test`, or your favorite testing framework.
- Use `NBTesting.nbtest("Water_Analysis.ipynb")` to run the notebook's code and tests. It will create and execute a file called `NBTest_Water_Analysis.jl`.
- (Optional) Track this `.jl` file with git if all tests are successful.

`nbtest` will mostly run the notebook code as is (similar to NBInclude.jl), but it provides a few ways to control which code gets executed when, and a `verbose=...` option for printing the headers (on by default; see `?nbtest` for details). The code is wrapped inside a module called `NBTest_[Notebook name]`, to isolate it from the current environment, and to make it easier to inspect the state of the notebook variables if a test fails.
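For instance, a test cell and the call to run it could look like this (a minimal sketch; the variable and the computation are made up for illustration, while `nbtest` and the generated filename are as described above):

```julia
# In a cell of Water_Analysis.ipynb, next to the analysis itself:
using Base.Test

concentration = sum([0.5, 1.0, 1.5])   # hypothetical notebook computation
@test concentration ≈ 3.0

# Then, from the REPL or a CI script:
using NBTesting
NBTesting.nbtest("Water_Analysis.ipynb")   # writes and runs NBTest_Water_Analysis.jl
```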
This seems odd to me — why would you have to create a file in order to execute the code?
You don’t have to. The goal is to produce a git-friendly `.jl` file, to track the changes to the tests (e.g. to make `git bisect` convenient: just `include` the failing test file). Tracking Jupyter notebooks in git is painful.

You can use `nbtest(...; outfile_name="/tmp/dummy.jl")` if you don’t care about it.
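To make the bisect case concrete, here is a sketch of how the tracked file could be used to reproduce a failure without Jupyter (the filename matches the example above; `concentration` is just a hypothetical notebook variable to inspect):

```julia
# Reproduce the failure directly from the committed .jl file (no Jupyter needed):
include("NBTest_Water_Analysis.jl")

# The code runs inside the NBTest_Water_Analysis module, so the notebook's
# variables can be inspected after a failing test (hypothetical variable name):
NBTest_Water_Analysis.concentration
```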
So you expect people to commit generated code into the git repo? That doesn’t sound ideal either.
I track Jupyter notebooks in git all the time… as long as they don’t contain binary data (e.g. images), it isn’t too bad.
And there are also tricks to get git to do a better job with notebooks, if needed: https://gist.github.com/pbugnion/ea2797393033b54674af
Why not? I get why that’s usually a bad idea, but in this case, it’s just a cleaned-up version of the `.ipynb`.
Or plots… I don’t use NBTesting for writing “dedicated testing notebooks” (NBInclude.jl would be fine for that), but rather to make sure that any change to my code didn’t affect the results in my analysis/production notebooks.
That could work, too.