I am having trouble running Revise.jl on my Windows machine. My workflow is as follows: in VSCode I am writing my Julia module, and I also have the Julia REPL open separately.
I’ve run the following code in my REPL
include("path/to/module")
using Revise
using .PackageName
a = foo() # foo() defined in PackageName
Now when I edit a function in PackageName in VSCode and switch back to the REPL, I was hoping that Revise would “reload” the module with the edited function. But this doesn’t seem to be the case. Am I missing something?
I believe you need to use Revise.includet in this case. Or you could develop the package in Pkg.devdir() (or somewhere else on the LOAD_PATH) and Revise will find it automatically.
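For the includet route, a minimal sketch might look like this (the file path is just a placeholder):

using Revise                         # load Revise before the code you want tracked
includet("path/to/PackageName.jl")   # includet tracks this file for edits
using .PackageName
a = foo()                            # after editing foo(), the next call uses the new definition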
1. Navigate to where you want your package to live (I typically use .julia/dev).
2. Within Julia, do using PkgDev; PkgDev.generate("PackageName", "MIT") (or whatever license you prefer).
3. pkg> dev PackageName to bring it into your environment (or pkg> dev /path/to/PackageName if you placed it somewhere other than your .julia/dev directory).
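Put together, that package route might look roughly like this (PackageName and the paths are placeholders, adapt as needed):

shell> cd ~/.julia/dev                         # or wherever you want the package to live
julia> using PkgDev
julia> PkgDev.generate("PackageName", "MIT")   # or another license
pkg> dev PackageName                           # or dev /path/to/PackageName
julia> using Revise, PackageName               # Revise now tracks the dev'ed package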
It is probably not relevant, but I wonder whether all of this package-creation process could be standardized further inside Julia in the future (avoiding “Recommend” and “I typically use”). If so, the IDE’s (Atom, VSCode, …) job for package creation would be easy and essentially reduced to a one-click process (and you could switch between IDEs).
In C++, this kind of work is handled by IDEs like Qt Creator or Visual Studio, which generate a CMake project or an equivalent setup. It is not convenient because it is not standard. I love developing software and libraries, but I have always found configuration/environment/build management tools, testing tools, and documentation tools painful and boring to learn. I usually consider the corresponding “languages”/configuration-file syntaxes (git, doxygen, cvs, travis, cmake, …) to be ugly.
Julia’s Pkg/doc/test approach is indeed a great improvement, but I wonder if it could be standardized even further, to the point where there is no longer any question about it.
I have a similar problem to the original question about tracking local modules, but I am not sure if the suggested approach (using PkgTemplates) applies to my case. I already created a package. Within that package I created a local module and I want to include that module and use it (and track it with revise). So creating a package within a package does not make sense, I guess.
What would be the recommended workflow in this case? Not have local modules within a package and directly include a script full of functions? Or is there a way to track the local module with Revise?
OK, I described my setup wrongly. Let me give an MWE. I have a project, not a package, with a script and a module inside the project. Let’s say the script is MyProject.jl and the module is Subroutines.jl.
Contents of MyProject.jl:
include(joinpath(@__DIR__, "Subroutines.jl"))
using .Subroutines
greet() = print("Hello World!")
foo()
Contents of Subroutines.jl:
module Subroutines
export foo
foo() = "foo"
end
If I change foo() = "goo" in Subroutines.jl, Revise does not track the change, i.e., calling foo() does not return "goo".
Revise deliberately does not track files loaded by include; otherwise it might, e.g., re-run all your tests every time you change a single line in one of your test scripts.
Revise does track things loaded by using/import or includet. As the help for includet specifies, it is deliberately non-recursive.
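To illustrate the difference with the MWE above (just a sketch, using the file names from the example):

using Revise
# Tracked: includet() watches the file it loads (but not files that file include()s)
includet("Subroutines.jl")
using .Subroutines
foo()   # after editing foo() in Subroutines.jl, the next call uses the new definition

# Not tracked: a plain include("Subroutines.jl") would not be watched by Revise.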
Is there any reason not to load your method definitions with something like using MyProject? That is, turn MyProject into an actual “package”?
Well, the project is not intended to be used as a package anywhere else, because it’s a folder where I process results, and does not export functions. I have several scripts to process results, and a single module that exports common subroutines to all scripts.
If I convert MyProject into a package then changes in Subroutines.jl are still not tracked. I change MyProject.jl into:
module MyProject
include(joinpath(@__DIR__, "Subroutines.jl"))
using .Subroutines
greet() = print("Hello World!")
foo()
end
I also add a Project.toml file and dev the package. If I then run the snippets of MyProject.jl with Juno, changes in Subroutines.jl still do not get tracked.
When I develop, I run snippets of code in Juno within the project/package, and I use local modules inside the project/package by first including them. Should I includet before using? Or is it that I cannot track changes to a module used inside a script? Does a module always need to be dev’ed and converted to a package in order to be tracked?
The way to get this workflow working is to say using MyProject in the REPL; then I can edit both MyProject.jl and Subroutines.jl, run statements within Juno (using Shift+Enter) in the main package file, and the changes are tracked.
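A rough sketch of that session (the path is a placeholder, and Revise is loaded before the package):

pkg> dev /path/to/MyProject      # once, to put the package in the environment
julia> using Revise
julia> using MyProject
# Edit MyProject.jl or Subroutines.jl in the editor; the next statement you
# run (e.g. with Shift+Enter in Juno) picks up the new definitions.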
I think it would be nice, though, to be able to track local modules that have explicitly been loaded with using, without having to dev them (which would require them to be packages).
Mmmm, I don’t know. I started wrapping every file of functions in a module because I thought that was the recommended way to reuse code, both in packages and in projects.
I can always remove the module wrapper and use includet within projects.
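For instance (just a sketch, with a hypothetical file name), a plain file of bare function definitions could be loaded and tracked with:

using Revise
includet("subroutines.jl")   # hypothetical file of function definitions, no module wrapper
foo()                        # edits to foo() in subroutines.jl are picked up on the next call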
By “mechanisms for getting the list of files a package depends on”, do you mean the Project.toml and Manifest.toml files?
Hey, I’m a bit late to this but I just want to share my solution. To use Revise with a module without having to wrap that module in a package, you can just add the folder to your LOAD_PATH:
push!(LOAD_PATH, pwd()) # or replace pwd() with whatever folder your module is in
using MyModule
I kind of wish the current working directory was in your LOAD_PATH by default. It seems like it’s discouraged to have a module that isn’t in a package, but I find there’s a real use case for this. For example when I need more structure than a single script can provide, but a package provides more structure than I need and I want to stay in my global package environment.
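One way to approximate that (my own workaround, not something standard) is to do the push in ~/.julia/config/startup.jl; note that pwd() there is the directory Julia was started in, not wherever you later cd:

# ~/.julia/config/startup.jl
push!(LOAD_PATH, pwd())   # adds the directory Julia was launched from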