ERROR: LoadError: LoadError: UndefVarError: n_actions not defined

hey all, I tried to run SolveMDP.jl from this code in Julia 1.1.1 on WSL Ubuntu. I have installed all the required packages, but I get the following error:

ERROR: LoadError: LoadError: UndefVarError: n_actions not defined
[1] getproperty(::Module, ::Symbol) at ./sysimg.jl:13
[2] top-level scope at none:0
[3] include at ./boot.jl:326 [inlined]
[4] include_relative(::Module, ::String) at ./loading.jl:1038
[5] include at ./sysimg.jl:29 [inlined]
[6] include(::String) at /home/xxxx/HorizontalCAS/GenerateTable/mdp/HCAS.jl:1
[7] top-level scope at none:0
[8] include at ./boot.jl:326 [inlined]
[9] include_relative(::Module, ::String) at ./loading.jl:1038
[10] include(::Module, ::String) at ./sysimg.jl:29
[11] top-level scope at none:2
[12] eval at ./boot.jl:328 [inlined]
[13] eval(::Expr) at ./client.jl:404
[14] top-level scope at ./none:3
Could you please help me to overcome this error?

Welcome! There’s no example for us to run here, so we can’t really help you much. Please provide a minimal (non-)working example for us to run.


I don’t have such a minimal example; the project is pretty complex. I end up with the given error when SolveMDP.jl is run.

The relevant part of the code, from HCAS.jl, is:

# Define necessary functions for HCAS MDP
POMDPs.actionindex(::HCAS_MDP, a::actType) = a + 1
POMDPs.actions(mdp::HCAS_MDP) = ACTIONS
POMDPs.discount(mdp::HCAS_MDP) = mdp.discount_factor
POMDPs.n_actions(::HCAS_MDP) = length(ACTIONS)

I think an MWE is quite simple in this case:

julia> using POMDPs

julia> POMDPs.n_actions
ERROR: UndefVarError: n_actions not defined
 [1] getproperty(x::Module, f::Symbol)
   @ Base .\Base.jl:35
 [2] top-level scope
   @ REPL[16]:100

I don’t know what the POMDPs package is or does; why do you think it should have an n_actions function? Often these kinds of errors arise from copying example code written for older versions of a package.
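For what it’s worth, in more recent versions of POMDPs.jl the n_actions function appears to have been removed, with the action count derived from the action space instead. A minimal sketch with a hypothetical toy MDP type (ToyMDP is a stand-in, not from the repo):

```julia
using POMDPs

# Hypothetical stand-in for the repo's HCAS_MDP type
struct ToyMDP <: MDP{Int,Int} end

# Define the action space for the toy MDP
POMDPs.actions(::ToyMDP) = 1:3

# Old-style API (roughly POMDPs.jl <= 0.8):
#   POMDPs.n_actions(::ToyMDP) = 3
# Newer API: get the count from the action space itself
length(actions(ToyMDP()))   # == 3
```

So an alternative to downgrading the package would be porting the repo’s code to the newer interface.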


There doesn’t seem to be any Project.toml (where package dependencies are recorded) or Manifest.toml (where the exact versions of those dependencies are recorded) present in that repo. I agree with @nilshg: make sure you have the versions of the packages used by the original author of that paper installed. It’s a shame they don’t provide them in the repo, especially since Julia usually makes that rather easy to do. Your best bet is probably contacting the authors to find out which version of each dependency was used.
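For reference, if those two files had been committed, reproducing the author’s environment would be as simple as the usual Pkg workflow (a sketch):

```julia
julia> ]            # enter Pkg mode from the REPL
pkg> activate .     # use the Project.toml in the current directory
pkg> instantiate    # install the exact versions pinned in Manifest.toml
```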

One option would be to try the version from this PR, where another kind soul already did that grunt work. Why it wasn’t merged in the end, I don’t know.

According to that PR, POMDPs is supposed to be at version 0.7.0, while the most up-to-date version is 0.9.3. Can you post the output of ]st from the REPL, so we can check which version you have installed?
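If your installed version turns out to be too new, one option would be pinning POMDPs to the version the PR suggests (a sketch; the @version syntax is the standard Pkg way to request a specific release):

```julia
pkg> add POMDPs@0.7.0   # install the specific version the repo expects
pkg> st                 # confirm the installed versions
```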


How can I remove all packages and reinstall?

Thank you all! Removing and reinstalling all packages fixed that error. Now I get a different kind of error:

ERROR: LoadError: ProcessExitedException()
[1] worker_from_id(::Distributed.ProcessGroup, ::Int64) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Distributed/src/cluster.jl:969
[2] worker_from_id at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Distributed/src/cluster.jl:966 [inlined]
[3] #remotecall_fetch#152 at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Distributed/src/remotecall.jl:406 [inlined]
[4] remotecall_fetch at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Distributed/src/remotecall.jl:406 [inlined]
[5] call_on_owner at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Distributed/src/remotecall.jl:479 [inlined]
[6] fetch(::Future) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Distributed/src/remotecall.jl:511
[7] compute_trans_reward(::Array{Union{MDP, POMDP},1}, ::LocalGIFunctionApproximator{RectangleGrid{6}}, ::Int64) at /mnt/c/Users/PC_2219/Desktop/project_folder/bu_son_olsun/HorizontalCAS/GenerateTable/SolveMDP.jl:97
[8] computeQ(::Array{Union{MDP, POMDP},1}, ::LocalGIFunctionApproximator{RectangleGrid{6}}, ::Int64, ::Int64) at /mnt/c/Users/xxx/Desktop/project_folder/xxxxx/HorizontalCAS/GenerateTable/SolveMDP.jl:105

Worker 3 terminated.

ERROR: LoadError: ProcessExitedException()
[1] try_yieldto(::typeof(Base.ensure_rescheduled), ::Base.RefValue{Task}) at ./event.jl:196
[2] wait() at ./event.jl:255
[3] wait(::Condition) at ./event.jl:46
[4] take_buffered(::Channel{Any}) at ./channels.jl:362
[5] take!(::Channel{Any}) at ./channels.jl:315
[6] take!(::Distributed.RemoteValue) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Distributed/src/remotecall.jl:576
[7] #remotecall_fetch#149(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Function, ::Distributed.Worker, ::Distributed.RRID) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Distributed/src/remotecall.jl:375
[8] remotecall_fetch(::Function, ::Distributed.Worker, ::Distributed.RRID, ::Vararg{Any,N} where N) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Distributed/src/remotecall.jl:371
[9] #remotecall_fetch#152 at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Distributed/src/remotecall.jl:406 [inlined]
[10] remotecall_fetch at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Distributed/src/remotecall.jl:406 [inlined]
[11] call_on_owner at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Distributed/src/remotecall.jl:479 [inlined]
[12] fetch(::Future) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.1/Distributed/src/remotecall.jl:511

Again, can you post the code that leads to this error? Having the error message is useful, but not being able to look at the code producing it makes it hard to help. Purely from the error, it seems like one of the worker processes used by Distributed is crashing. Whether that is because of your cluster setup or your code, I can’t say.
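For context, ProcessExitedException is what you see when a Distributed worker dies while the master is still waiting on a result from it. A minimal sketch reproducing that shape of failure (here the worker is removed deliberately; in your case something is killing it):

```julia
using Distributed

addprocs(1)              # start one worker process
w = workers()[1]

f = remotecall(w) do     # schedule some long-running work on the worker
    sleep(30)
    42
end

rmprocs(w)               # the worker goes away before finishing
fetch(f)                 # throws ProcessExitedException
```

If your workers are dying on their own like this, checking memory usage is a good first step, since out-of-memory kills on the worker are a common cause.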
