[solved] Can't install MPI.jl in julia 1.0.1/0.7.0 on CentOS 7.4

compilation
#1

Hello,
I’m new to Julia. I hope this post is at the right place.

So, I have to install MPI.jl module for one of our users. It doesn’t work. I have installed Linux 64 bits binary versions of julia 0.7.0 and 1.0.1

julia
               _
   _       _ _(_)_     |  A fresh approach to technical computing
  (_)     | (_) (_)    |  Documentation: https://docs.julialang.org
   _ _   _| |_  __ _   |  Type "?" for help, "]?" for Pkg help.
  | | | | | | |/ _` |  |
  | | |_| | | | (_| |  |  Version 0.7.0 (2018-08-08 06:46 UTC)
 _/ |\__'_|_|_|\__'_|  |  Official http://julialang.org/ release
|__/                   |  x86_64-pc-linux-gnu

(v0.7) pkg> update
Cloning default registries into //.julia/registries
Cloning registry General from "https://github.com/JuliaRegistries/General.git"
Updating registry at ~/.julia/registries/General
Updating git-repo https://github.com/JuliaRegistries/General.git
Resolving package versions…

(v0.7) pkg> add https://github.com/JuliaParallel/MPI.jl
Cloning git-repo https://github.com/JuliaParallel/MPI.jl
Updating git-repo https://github.com/JuliaParallel/MPI.jl
Resolving package versions…
Installed URIParser ─ v0.4.0
Installed BinDeps ─── v0.8.10
Installed Compat ──── v1.3.0
Updating ~/.julia/environments/v0.7/Project.toml
[da04e1cc] + MPI v0.7.0 #master (https://github.com/JuliaParallel/MPI.jl)
Updating ~/.julia/environments/v0.7/Manifest.toml
[9e28174c] + BinDeps v0.8.10
[34da2185] + Compat v1.3.0
[da04e1cc] + MPI v0.7.0 #master (https://github.com/JuliaParallel/MPI.jl)
[30578b45] + URIParser v0.4.0
[2a0f44e3] + Base64
[ade2ca70] + Dates
[8bb1440f] + DelimitedFiles
[8ba89e20] + Distributed
[b77e0a4c] + InteractiveUtils
[76f85450] + LibGit2
[8f399da3] + Libdl
[37e2e46d] + LinearAlgebra
[56ddb016] + Logging
[d6f4376e] + Markdown
[a63ad114] + Mmap
[44cfe95a] + Pkg
[de0858da] + Printf
[3fa0cd96] + REPL
[9a3f8284] + Random
[ea8e919c] + SHA
[9e88b42a] + Serialization
[1a1011a3] + SharedArrays
[6462fe0b] + Sockets
[2f01184e] + SparseArrays
[10745b16] + Statistics
[8dfed614] + Test
[cf7118a7] + UUIDs
[4ec0a83e] + Unicode
Building MPI → ~/.julia/packages/MPI/U5ujD/deps/build.log

julia> using MPI
[ Info: Precompiling MPI [da04e1cc-30fd-572f-bb4f-1f8673147195]
ERROR: LoadError: LoadError: UndefVarError: MPI_COMM_NULL not defined
Stacktrace:
[1] top-level scope at none:0
[2] include at ./boot.jl:317 [inlined]
[3] include_relative(::Module, ::String) at ./loading.jl:1038
[4] include at ./sysimg.jl:29 [inlined]
[5] include(::String) at //.julia/packages/MPI/U5ujD/src/MPI.jl:3
[6] top-level scope at none:0
[7] include at ./boot.jl:317 [inlined]
[8] include_relative(::Module, ::String) at ./loading.jl:1038
[9] include(::Module, ::String) at ./sysimg.jl:29
[10] top-level scope at none:2
[11] eval at ./boot.jl:319 [inlined]
[12] eval(::Expr) at ./client.jl:399
[13] top-level scope at ./none:3
in expression starting at //.julia/packages/MPI/U5ujD/src/mpi-base.jl:73
in expression starting at //.julia/packages/MPI/U5ujD/src/MPI.jl:20
ERROR: Failed to precompile MPI [da04e1cc-30fd-572f-bb4f-1f8673147195] to //.julia/compiled/v0.7/MPI/nO0XF.ji.
Stacktrace:
[1] error(::String) at ./error.jl:33
[2] macro expansion at ./logging.jl:313 [inlined]
[3] compilecache(::Base.PkgId, ::String) at ./loading.jl:1185
[4] _require(::Base.PkgId) at ./logging.jl:311
[5] require(::Base.PkgId) at ./loading.jl:852
[6] macro expansion at ./logging.jl:311 [inlined]
[7] require(::Module, ::Symbol) at ./loading.jl:834

If I look at the build.log file, everything seems to be OK: no warnings or error messages. However, I notice that the build does not use the recent gcc version I loaded with ‘module load’; it takes /usr/bin/gcc instead!

Regards,
Guy.


#2

The problem with the GCC version is that CMake does not always use PATH to find compilers. To fix this, set the CC and FC environment variables to the compilers you want to use and then re-run the build script for MPI.jl. Note that you should use the MPI compiler wrappers that correspond to the compilers loaded with module load. I usually do

export CC=`which mpicc`
export FC=`which mpif90`
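Putting it together, the full sequence looks roughly like this (a sketch: mvapich2 is a placeholder for whatever your site's MPI module is actually called, and Pkg.build re-runs the package's build script):

```shell
# Placeholder module name -- use your site's actual MPI module
module load mvapich2
# Point the build at the MPI compiler wrappers
export CC=$(which mpicc)
export FC=$(which mpif90)
# Re-run the MPI.jl build script
julia -e 'using Pkg; Pkg.build("MPI")'
```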

I’m not sure if this is the cause of MPI_COMM_NULL not being defined, but let’s start here and see what happens.

Also, can you post the contents of ~/.julia/packages/MPI/U5ujD/deps/build.log? I think that is where the output from the MPI.jl build script was redirected to.


#3

Thank you for your message.
So I set FC and CC as you suggested, and it works better.
The MPI build now completes with gcc/gfortran 8.1.0 instead of the old system versions.

(v0.7) pkg> add https://github.com/JuliaParallel/MPI.jl
   Cloning git-repo `https://github.com/JuliaParallel/MPI.jl`
  Updating git-repo `https://github.com/JuliaParallel/MPI.jl`
 Resolving package versions...
 Installed URIParser ─ v0.4.0
 Installed Compat ──── v1.3.0
 Installed BinDeps ─── v0.8.10
  Updating `~/.julia/environments/v0.7/Project.toml`
  [da04e1cc] + MPI v0.7.0 #master (https://github.com/JuliaParallel/MPI.jl)
  Updating `~/.julia/environments/v0.7/Manifest.toml`
  [9e28174c] + BinDeps v0.8.10
  [34da2185] + Compat v1.3.0
  [da04e1cc] + MPI v0.7.0 #master (https://github.com/JuliaParallel/MPI.jl)
  [30578b45] + URIParser v0.4.0
  [2a0f44e3] + Base64 
  [ade2ca70] + Dates 
  [8bb1440f] + DelimitedFiles 
  [8ba89e20] + Distributed 
  [b77e0a4c] + InteractiveUtils 
  [76f85450] + LibGit2 
  [8f399da3] + Libdl 
  [37e2e46d] + LinearAlgebra 
  [56ddb016] + Logging 
  [d6f4376e] + Markdown 
  [a63ad114] + Mmap 
  [44cfe95a] + Pkg 
  [de0858da] + Printf 
  [3fa0cd96] + REPL 
  [9a3f8284] + Random 
  [ea8e919c] + SHA 
  [9e88b42a] + Serialization 
  [1a1011a3] + SharedArrays 
  [6462fe0b] + Sockets 
  [2f01184e] + SparseArrays 
  [10745b16] + Statistics 
  [8dfed614] + Test 
  [cf7118a7] + UUIDs 
  [4ec0a83e] + Unicode 
  Building MPI → `~/.julia/packages/MPI/U5ujD/deps/build.log`

build.log is :

-- The Fortran compiler identification is GNU 8.1.0
-- The C compiler identification is GNU 8.1.0
-- Check for working Fortran compiler: /trinity/shared/apps/ccipl/machine-dependant/nazare-dc/soft/mvapich2/psm2/gcc72/2.2.3b/bin/mpif90
-- Check for working Fortran compiler: /trinity/shared/apps/ccipl/machine-dependant/nazare-dc/soft/mvapich2/psm2/gcc72/2.2.3b/bin/mpif90  -- works
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Checking whether /trinity/shared/apps/ccipl/machine-dependant/nazare-dc/soft/mvapich2/psm2/gcc72/2.2.3b/bin/mpif90 supports Fortran 90
-- Checking whether /trinity/shared/apps/ccipl/machine-dependant/nazare-dc/soft/mvapich2/psm2/gcc72/2.2.3b/bin/mpif90 supports Fortran 90 -- yes
-- Check for working C compiler: /trinity/shared/apps/ccipl/machine-dependant/nazare-dc/soft/mvapich2/psm2/gcc72/2.2.3b/bin/mpicc
-- Check for working C compiler: /trinity/shared/apps/ccipl/machine-dependant/nazare-dc/soft/mvapich2/psm2/gcc72/2.2.3b/bin/mpicc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Found Git: /trinity/shared/apps/cv-standard/git/2.16.1/bin/git (found version "2.16.1") 
-- Found MPI_C: /trinity/shared/apps/ccipl/machine-dependant/nazare-dc/soft/mvapich2/psm2/gcc72/2.2.3b/bin/mpicc (found version "3.1") 
-- Found MPI_Fortran: /trinity/shared/apps/ccipl/machine-dependant/nazare-dc/soft/mvapich2/psm2/gcc72/2.2.3b/bin/mpif90 (found version "3.1") 
-- Found MPI: TRUE (found version "3.1")  
-- Detecting Fortran/C Interface
-- Detecting Fortran/C Interface - Found GLOBAL and MODULE mangling
-- Looking for MPI_Comm_c2f
-- Looking for MPI_Comm_c2f - not found
-- Configuring done
-- Generating done
-- Build files have been written to: /<path>/.julia/packages/MPI/U5ujD/deps/build
Scanning dependencies of target gen_functions
[ 11%] Building C object CMakeFiles/gen_functions.dir/gen_functions.c.o
[ 22%] Linking C executable gen_functions
[ 22%] Built target gen_functions
Scanning dependencies of target gen_constants
[ 33%] Building Fortran object CMakeFiles/gen_constants.dir/gen_constants.f90.o
[ 44%] Linking Fortran executable gen_constants
/usr/bin/ld: warning: libgfortran.so.4, needed by /trinity/shared/apps/ccipl/machine-dependant/nazare-dc/soft/mvapich2/psm2/gcc72/2.2.3b/lib/libmpifort.so,
 may conflict with libgfortran.so.5
[ 44%] Built target gen_constants
Scanning dependencies of target mpijl-build
[ 55%] Generating mpi-build.jl
[ 55%] Built target mpijl-build
Scanning dependencies of target mpijl
[ 66%] Generating compile-time.jl
[ 66%] Built target mpijl
Scanning dependencies of target juliampi
[ 77%] Building C object CMakeFiles/juliampi.dir/juliampi.c.o
[ 88%] Building Fortran object CMakeFiles/juliampi.dir/test_mpi.f90.o
[100%] Linking Fortran shared library libjuliampi.so
[100%] Built target juliampi
[ 22%] Built target gen_functions
[ 44%] Built target gen_constants
[ 55%] Built target mpijl-build
[ 66%] Built target mpijl
[100%] Built target juliampi
Install the project...
-- Install configuration: ""
-- Installing: /path>/.julia/packages/MPI/U5ujD/deps/src/./compile-time.jl
-- Installing: /path>/.julia/packages/MPI/U5ujD/deps/usr/lib/libjuliampi.so
[ Info: Attempting to create directory /home/CCIPL/<path>/.julia/packages/MPI/U5ujD/deps/build
[ Info: Changing directory to /path>/.julia/packages/MPI/U5ujD/deps/build

The command “using MPI” works fine now.

using MPI
[ Info: Precompiling MPI [da04e1cc-30fd-572f-bb4f-1f8673147195]

So I exit julia and launch the first example (https://github.com/JuliaParallel/MPI.jl/blob/master/examples/01-hello.jl) from the command line:

 mpirun -np 3 julia 01-hello.jl
Hello world, I am 0 of 3
Hello world, I am 1 of 3
Hello world, I am 2 of 3
[0] 512 at [0x0000000002260ec8], d/ch3/channels/common/src/affinity/hwloc_bind.c[2090]
[2] 512 at [0x000000000235f7a8], d/ch3/channels/common/src/affinity/hwloc_bind.c[2090]
[1] 512 at [0x0000000000d4e958], d/ch3/channels/common/src/affinity/hwloc_bind.c[2090]
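For reference, that 01-hello.jl example boils down to roughly the following (paraphrased from the MPI.jl repository, so minor details may differ between versions; it has to be launched under mpirun):

```julia
import MPI

MPI.Init()
comm = MPI.COMM_WORLD
# Each rank prints its id and the total number of ranks
println("Hello world, I am $(MPI.Comm_rank(comm)) of $(MPI.Comm_size(comm))")
MPI.Barrier(comm)
MPI.Finalize()
```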

My MPI implementation is MVAPICH2, v 2.2.3b
I will retry with OpenMPI to see if I get the same final behavior.

Thank you for your help.
Regards,


#4

It’s working fine with OpenMPI.
Thank you for your help.
Regards,


#5

This does not work for me with OpenMPI. I have been dealing with this for multiple weeks now.

(v1.0) pkg> add MPI
 Resolving package versions...
  Updating `~/.julia/environments/v1.0/Project.toml`
  [da04e1cc] + MPI v0.7.2
  Updating `~/.julia/environments/v1.0/Manifest.toml`
  [da04e1cc] + MPI v0.7.2

(v1.0) pkg> precompile
Precompiling project...
Precompiling MPI
[ Info: Precompiling MPI [da04e1cc-30fd-572f-bb4f-1f8673147195]
ERROR: LoadError: LoadError: UndefVarError: MPI_COMM_NULL not defined
Stacktrace:
 [1] top-level scope at none:0
 [2] include at ./boot.jl:317 [inlined]
 [3] include_relative(::Module, ::String) at ./loading.jl:1041
 [4] include at ./sysimg.jl:29 [inlined]
 [5] include(::String) at /home/ubuntu/.julia/packages/MPI/U5ujD/src/MPI.jl:3
 [6] top-level scope at none:0
 [7] include at ./boot.jl:317 [inlined]
 [8] include_relative(::Module, ::String) at ./loading.jl:1041
 [9] include(::Module, ::String) at ./sysimg.jl:29
 [10] top-level scope at none:2
 [11] eval at ./boot.jl:319 [inlined]
 [12] eval(::Expr) at ./client.jl:389
 [13] top-level scope at ./none:3
in expression starting at /home/ubuntu/.julia/packages/MPI/U5ujD/src/mpi-base.jl:73
in expression starting at /home/ubuntu/.julia/packages/MPI/U5ujD/src/MPI.jl:20
ERROR: Failed to precompile MPI [da04e1cc-30fd-572f-bb4f-1f8673147195] to /home/ubuntu/.julia/compiled/v1.0/MPI/nO0XF.ji.
Stacktrace:
 [1] error(::String) at ./error.jl:33
 [2] macro expansion at ./logging.jl:313 [inlined]
 [3] compilecache(::Base.PkgId, ::String) at ./loading.jl:1187
 [4] precompile(::Pkg.Types.Context) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/API.jl:506
 [5] do_precompile!(::Dict{Symbol,Any}, ::Array{String,1}, ::Dict{Symbol,Any}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/REPLMode.jl:662
 [6] #invokelatest#1(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::Any, ::Any, ::Vararg{Any,N} where N) at ./essentials.jl:697
 [7] invokelatest(::Any, ::Any, ::Vararg{Any,N} where N) at ./essentials.jl:696
 [8] do_cmd!(::Pkg.REPLMode.PkgCommand, ::REPL.LineEditREPL) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/REPLMode.jl:603
 [9] #do_cmd#33(::Bool, ::Function, ::REPL.LineEditREPL, ::String) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/REPLMode.jl:577
 [10] do_cmd at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/REPLMode.jl:573 [inlined]
 [11] (::getfield(Pkg.REPLMode, Symbol("##44#47")){REPL.LineEditREPL,REPL.LineEdit.Prompt})(::REPL.LineEdit.MIState, ::Base.GenericIOBuffer{Array{UInt8,1}}, ::Bool) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/Pkg/src/REPLMode.jl:912
 [12] #invokelatest#1 at ./essentials.jl:697 [inlined]
 [13] invokelatest at ./essentials.jl:696 [inlined]
 [14] run_interface(::REPL.Terminals.TextTerminal, ::REPL.LineEdit.ModalInterface, ::REPL.LineEdit.MIState) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/REPL/src/LineEdit.jl:2261
 [15] run_frontend(::REPL.LineEditREPL, ::REPL.REPLBackendRef) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/REPL/src/REPL.jl:1029
 [16] run_repl(::REPL.AbstractREPL, ::Any) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.0/REPL/src/REPL.jl:191
 [17] (::getfield(Base, Symbol("##719#721")){Bool,Bool,Bool,Bool})(::Module) at ./logging.jl:311
 [18] #invokelatest#1 at ./essentials.jl:697 [inlined]
 [19] invokelatest at ./essentials.jl:696 [inlined]
 [20] macro expansion at ./logging.jl:308 [inlined]
 [21] run_main_repl(::Bool, ::Bool, ::Bool, ::Bool, ::Bool) at ./client.jl:330
 [22] exec_options(::Base.JLOptions) at ./client.jl:242
 [23] _start() at ./client.jl:421

#6

Can you post the contents of the file /home/ubuntu/.julia/packages/MPI/deps/src/compile-time.jl?


#7
const libmpi = "/home/ubuntu/.julia/packages/MPI/U5ujD/deps/usr/lib/libjuliampi"

using Compat
import Compat.String


const _mpi_functions = Dict{Symbol, String}(
    :MPI_ABORT              => "mpi_abort_",
    :MPI_ACCUMULATE         => "mpi_accumulate_",
    :MPI_ALLGATHER          => "mpi_allgather_",
    :MPI_ALLGATHERV         => "mpi_allgatherv_",
    :MPI_ALLREDUCE          => "mpi_allreduce_",
    :MPI_ALLTOALL           => "mpi_alltoall_",
    :MPI_ALLTOALLV          => "mpi_alltoallv_",
    :MPI_BARRIER            => "mpi_barrier_",
    :MPI_BCAST              => "mpi_bcast_",
    :MPI_BSEND              => "mpi_bsend_",
    :MPI_CANCEL             => "mpi_cancel_",
    :MPI_COMM_DUP           => "mpi_comm_dup_",
    :MPI_COMM_FREE          => "mpi_comm_free_",
    :MPI_COMM_GET_PARENT    => "mpi_comm_get_parent_",
    :MPI_COMM_RANK          => "mpi_comm_rank_",
    :MPI_COMM_SIZE          => "mpi_comm_size_",
    :MPI_COMM_SPLIT         => "mpi_comm_split_",
    :MPI_COMM_SPLIT_TYPE    => "mpi_comm_split_type_",
    :MPI_EXSCAN             => "mpi_exscan_",
    :MPI_FETCH_AND_OP       => "mpi_fetch_and_op_",
    :MPI_FINALIZE           => "mpi_finalize_",
    :MPI_FINALIZED          => "mpi_finalized_",
    :MPI_GATHER             => "mpi_gather_",
    :MPI_GATHERV            => "mpi_gatherv_",
    :MPI_GET                => "mpi_get_",
    :MPI_GET_ACCUMULATE     => "mpi_get_accumulate_",
    :MPI_GET_ADDRESS        => "mpi_get_address_",
    :MPI_GET_COUNT          => "mpi_get_count_",
    :MPI_GET_PROCESSOR_NAME => "mpi_get_processor_name_",
    :MPI_INFO_CREATE        => "mpi_info_create_",
    :MPI_INFO_DELETE        => "mpi_info_delete_",
    :MPI_INFO_FREE          => "mpi_info_free_",
    :MPI_INFO_GET           => "mpi_info_get_",
    :MPI_INFO_GET_VALUELEN  => "mpi_info_get_valuelen_",
    :MPI_INFO_SET           => "mpi_info_set_",
    :MPI_INIT               => "mpi_init_",
    :MPI_INITIALIZED        => "mpi_initialized_",
    :MPI_INTERCOMM_MERGE    => "mpi_intercomm_merge_",
    :MPI_IPROBE             => "mpi_iprobe_",
    :MPI_IRECV              => "mpi_irecv_",
    :MPI_ISEND              => "mpi_isend_",
    :MPI_OP_CREATE          => "mpi_op_create_",
    :MPI_OP_FREE            => "mpi_op_free_",
    :MPI_PACK               => "mpi_pack_",
    :MPI_PACK_SIZE          => "mpi_pack_size_",
    :MPI_PROBE              => "mpi_probe_",
    :MPI_PUT                => "mpi_put_",
    :MPI_RECV               => "mpi_recv_",
    :MPI_RECV_INIT          => "mpi_recv_init_",
    :MPI_REDUCE             => "mpi_reduce_",
    :MPI_REQUEST_FREE       => "mpi_request_free_",
    :MPI_RSEND              => "mpi_rsend_",
    :MPI_SCAN               => "mpi_scan_",
    :MPI_SCATTER            => "mpi_scatter_",
    :MPI_SCATTERV           => "mpi_scatterv_",
    :MPI_SEND               => "mpi_send_",
    :MPI_SEND_INIT          => "mpi_send_init_",
    :MPI_SSEND              => "mpi_ssend_",
    :MPI_TEST               => "mpi_test_",
    :MPI_TESTALL            => "mpi_testall_",
    :MPI_TESTANY            => "mpi_testany_",
    :MPI_TESTSOME           => "mpi_testsome_",
    :MPI_UNPACK             => "mpi_unpack_",
    :MPI_WAIT               => "mpi_wait_",
    :MPI_WAITALL            => "mpi_waitall_",
    :MPI_WAITANY            => "mpi_waitany_",
    :MPI_WAITSOME           => "mpi_waitsome_",
    :MPI_WIN_ATTACH         => "mpi_win_attach_",
    :MPI_WIN_CREATE         => "mpi_win_create_",
    :MPI_WIN_CREATE_DYNAMIC => "mpi_win_create_dynamic_",
    :MPI_WIN_DETACH         => "mpi_win_detach_",
    :MPI_WIN_FENCE          => "mpi_win_fence_",
    :MPI_WIN_FLUSH          => "mpi_win_flush_",
    :MPI_WIN_FREE           => "mpi_win_free_",
    :MPI_WIN_LOCK           => "mpi_win_lock_",
    :MPI_WIN_SYNC           => "mpi_win_sync_",
    :MPI_WIN_UNLOCK         => "mpi_win_unlock_",
    :MPI_WTICK              => "mpi_wtick_",
    :MPI_WTIME              => "mpi_wtime_",
    :MPI_TYPE_CREATE_STRUCT => "mpi_type_create_struct_",
    :MPI_TYPE_COMMIT        => "mpi_type_commit_",
)

primitive type CComm 64 end
primitive type CInfo 64 end
primitive type CWin 64 end


const HAVE_MPI_COMM_C2F = true

Thanks for your prompt response!


#8

Are you sure the build of MPI.jl completed successfully (have a look at deps/build.log)? There is a section of the file missing.

In addition to looking at the build log, can you check if deps/build/gen_constants exists? If so, can you run it and post the output? This is the executable that prints out the information that goes into the missing section of compile-time.jl.
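Concretely, using the package path from your logs:

```shell
cd ~/.julia/packages/MPI/U5ujD/deps/build
# This executable generates the MPI constants section of compile-time.jl
./gen_constants
```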


#9

Build is successful

-- The Fortran compiler identification is GNU 5.5.0
-- The C compiler identification is GNU 5.5.0
-- Check for working Fortran compiler: /home/linuxbrew/.linuxbrew/bin/mpif90
-- Check for working Fortran compiler: /home/linuxbrew/.linuxbrew/bin/mpif90  -- works
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Checking whether /home/linuxbrew/.linuxbrew/bin/mpif90 supports Fortran 90
-- Checking whether /home/linuxbrew/.linuxbrew/bin/mpif90 supports Fortran 90 -- yes
-- Check for working C compiler: /home/linuxbrew/.linuxbrew/bin/mpicc
-- Check for working C compiler: /home/linuxbrew/.linuxbrew/bin/mpicc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Found Git: /usr/bin/git (found version "2.7.4") 
-- Found MPI_C: /home/linuxbrew/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libmpi.so  
-- Found MPI_Fortran: /home/linuxbrew/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libmpi_usempif08.so;/home/linuxbrew/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libmpi_usempi_ignore_tkr.so;/home/linuxbrew/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libmpi_mpifh.so;/home/linuxbrew/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libmpi.so  
-- Detecting Fortran/C Interface
-- Detecting Fortran/C Interface - Found GLOBAL and MODULE mangling
-- Looking for MPI_Comm_c2f
-- Looking for MPI_Comm_c2f - found
-- Configuring done
-- Generating done
-- Build files have been written to: /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build
Scanning dependencies of target gen_constants
[ 11%] Building Fortran object CMakeFiles/gen_constants.dir/gen_constants.f90.o
[ 22%] Linking Fortran executable gen_constants
/usr/bin/ld: warning: libgfortran.so.3, needed by /home/linuxbrew/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libmpi_usempif08.so, may conflict with libgfortran.so.4
[ 22%] Built target gen_constants
Scanning dependencies of target gen_functions
[ 33%] Building C object CMakeFiles/gen_functions.dir/gen_functions.c.o
[ 44%] Linking C executable gen_functions
[ 44%] Built target gen_functions
Scanning dependencies of target mpijl-build
[ 55%] Generating mpi-build.jl
/home/ubuntu/.julia/packages/MPI/U5ujD/deps/build/gen_constants: error while loading shared libraries: libgfortran.so.4: cannot open shared object file: No such file or directory
[ 55%] Built target mpijl-build
Scanning dependencies of target mpijl
[ 66%] Generating compile-time.jl
/home/ubuntu/.julia/packages/MPI/U5ujD/deps/build/gen_constants: error while loading shared libraries: libgfortran.so.4: cannot open shared object file: No such file or directory
[ 66%] Built target mpijl
Scanning dependencies of target juliampi
[ 77%] Building C object CMakeFiles/juliampi.dir/juliampi.c.o
[ 88%] Building Fortran object CMakeFiles/juliampi.dir/test_mpi.f90.o
[100%] Linking Fortran shared library libjuliampi.so
[100%] Built target juliampi
[ 22%] Built target gen_constants
[ 44%] Built target gen_functions
[ 55%] Built target mpijl-build
[ 66%] Built target mpijl
[100%] Built target juliampi
Install the project...
-- Install configuration: ""
-- Installing: /home/ubuntu/.julia/packages/MPI/U5ujD/deps/src/./compile-time.jl
-- Installing: /home/ubuntu/.julia/packages/MPI/U5ujD/deps/usr/lib/libjuliampi.so
[ Info: Attempting to create directory /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build
[ Info: Changing directory to /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build

Output of deps/build/gen_constants is

./gen_constants: error while loading shared libraries: libgfortran.so.4: cannot open shared object file: No such file or directory


#10

Not finding libgfortran is the root cause of the problem. The fact that the build system failed to execute gen_constants but reported success anyway is a separate problem.

With other software, I have solved issues like this by making sure the /lib directory of the compiler installation is in my LD_LIBRARY_PATH. If you have more than one compiler installed, make sure it is the /lib directory of the compiler that mpif90 is wrapping.

Something you can do to check whether your current environment is set up correctly is ldd ./gen_constants. This prints out all the libraries the executable is dynamically linked to and their locations, and will print "not found" for any it can’t locate.
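As a sketch (the compiler prefix below is a made-up example path; substitute the lib directory of the compiler your mpif90 wraps):

```shell
# Hypothetical compiler lib directory -- adjust to your installation
export LD_LIBRARY_PATH=/opt/gcc/7/lib64:$LD_LIBRARY_PATH
cd ~/.julia/packages/MPI/U5ujD/deps/build
# Every line should now resolve to a real path; "not found" means
# the loader still cannot locate that library
ldd ./gen_constants
```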


#11

There seems to be a problem. Here is the content of /home/ubuntu/.linuxbrew/lib/, where libgfortran.so.4 should be located, but it is missing completely …

lrwxrwxrwx  1 ubuntu ubuntu   37 Feb 27 22:59 libgcc_s.so -> ../Cellar/gcc/5.5.0_4/lib/libgcc_s.so*
lrwxrwxrwx  1 ubuntu ubuntu   39 Feb 27 22:59 libgcc_s.so.1 -> ../Cellar/gcc/5.5.0_4/lib/libgcc_s.so.1*
lrwxrwxrwx  1 ubuntu ubuntu   39 Feb 27 22:59 libgfortran.a -> ../Cellar/gcc/5.5.0_4/lib/libgfortran.a
lrwxrwxrwx  1 ubuntu ubuntu   40 Feb 27 22:59 libgfortran.so -> ../Cellar/gcc/5.5.0_4/lib/libgfortran.so*
lrwxrwxrwx  1 ubuntu ubuntu   42 Feb 27 22:59 libgfortran.so.3 -> ../Cellar/gcc/5.5.0_4/lib/libgfortran.so.3*
lrwxrwxrwx  1 ubuntu ubuntu   46 Feb 27 22:59 libgfortran.so.3.0.0 -> ../Cellar/gcc/5.5.0_4/lib/libgfortran.so.3.0.0*
lrwxrwxrwx  1 ubuntu ubuntu   42 Feb 27 22:59 libgfortran.spec -> ../Cellar/gcc/5.5.0_4/lib/libgfortran.spec

Here is the output of ldd ./gen_constants:

uche:~$ ldd /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build/gen_constants 
	linux-vdso.so.1 =>  (0x00007ffce83e3000)
	libmpi_usempif08.so.40 => /home/linuxbrew/.linuxbrew/lib/libmpi_usempif08.so.40 (0x00007f84d8828000)
	libmpi_usempi_ignore_tkr.so.40 => /home/linuxbrew/.linuxbrew/lib/libmpi_usempi_ignore_tkr.so.40 (0x00007f84d861f000)
	libmpi_mpifh.so.40 => /home/linuxbrew/.linuxbrew/lib/libmpi_mpifh.so.40 (0x00007f84d83b2000)
	libmpi.so.40 => /home/linuxbrew/.linuxbrew/lib/libmpi.so.40 (0x00007f84d8096000)
	libgfortran.so.4 => not found
	libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f84d7d8d000)
	libgcc_s.so.1 => /home/linuxbrew/.linuxbrew/lib/libgcc_s.so.1 (0x00007f84d7b76000)
	libquadmath.so.0 => /home/linuxbrew/.linuxbrew/lib/libquadmath.so.0 (0x00007f84d7937000)
	libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f84d771a000)
	libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f84d7350000)
	libopen-rte.so.40 => /home/linuxbrew/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libopen-rte.so.40 (0x00007f84d7090000)
	libopen-pal.so.40 => /home/linuxbrew/.linuxbrew/Cellar/open-mpi/4.0.0/lib/libopen-pal.so.40 (0x00007f84d6d96000)
	libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f84d6b92000)
	librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007f84d698a000)
	libutil.so.1 => /lib/x86_64-linux-gnu/libutil.so.1 (0x00007f84d6787000)
	libz.so.1 => /home/linuxbrew/.linuxbrew/lib/libz.so.1 (0x00007f84d6572000)
	libevent-2.1.so.6 => /home/linuxbrew/.linuxbrew/lib/libevent-2.1.so.6 (0x00007f84d6329000)
	libevent_pthreads-2.1.so.6 => /home/linuxbrew/.linuxbrew/lib/libevent_pthreads-2.1.so.6 (0x00007f84d6125000)
	libgfortran.so.3 => /home/linuxbrew/.linuxbrew/lib/libgfortran.so.3 (0x00007f84d5dfc000)
	/home/linuxbrew/.linuxbrew/lib/ld.so => /lib64/ld-linux-x86-64.so.2 (0x00007f84d8a62000)

And gcc is fully installed

uche:~$ brew install gcc --with-glibc
Warning: gcc 5.5.0_4 is already installed and up-to-date
To reinstall 5.5.0_4, run `brew reinstall gcc`

#12

That’s odd. It looks like you are using Linuxbrew to install things. My guess is something went wrong between the gcc and MPI installations. I don’t have any experience with Linuxbrew, but my next suggestion is to find where libgfortran.so.4 is installed (ldconfig -p may be helpful here); hopefully that will give you a hint as to what happened at link time for gen_constants.
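Something along these lines may help locate it (the Linuxbrew prefix is the one from your ldd output; on some systems ldconfig lives in /sbin):

```shell
# Ask the dynamic linker cache where libgfortran lives
ldconfig -p | grep libgfortran
# Search the Linuxbrew tree directly
find /home/linuxbrew/.linuxbrew -name 'libgfortran.so*' 2>/dev/null
```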


#13

So I installed gcc-7 using Linuxbrew

Then I linked libgfortran.so.4 into /home/ubuntu/.linuxbrew/lib, where the others are located:

lrwxrwxrwx  1 ubuntu ubuntu   37 Feb 27 22:59 libgcc_s.so -> ../Cellar/gcc/5.5.0_4/lib/libgcc_s.so*
lrwxrwxrwx  1 ubuntu ubuntu   39 Feb 27 22:59 libgcc_s.so.1 -> ../Cellar/gcc/5.5.0_4/lib/libgcc_s.so.1*
lrwxrwxrwx  1 ubuntu ubuntu   39 Feb 27 22:59 libgfortran.a -> ../Cellar/gcc/5.5.0_4/lib/libgfortran.a
lrwxrwxrwx  1 ubuntu ubuntu   40 Feb 27 22:59 libgfortran.so -> ../Cellar/gcc/5.5.0_4/lib/libgfortran.so*
lrwxrwxrwx  1 ubuntu ubuntu   42 Feb 27 22:59 libgfortran.so.3 -> ../Cellar/gcc/5.5.0_4/lib/libgfortran.so.3*
lrwxrwxrwx  1 ubuntu ubuntu   46 Feb 27 22:59 libgfortran.so.3.0.0 -> ../Cellar/gcc/5.5.0_4/lib/libgfortran.so.3.0.0*
lrwxrwxrwx  1 ubuntu ubuntu   69 Feb 28 00:11 libgfortran.so.4 -> /home/ubuntu/.linuxbrew/Cellar/gcc@7/7.4.0/lib/gcc/7/libgfortran.so.4*
lrwxrwxrwx  1 ubuntu ubuntu   73 Feb 28 00:12 libgfortran.so.4.0.0 -> /home/ubuntu/.linuxbrew/Cellar/gcc@7/7.4.0/lib/gcc/7/libgfortran.so.4.0.0*
lrwxrwxrwx  1 ubuntu ubuntu   42 Feb 27 22:59 libgfortran.spec -> ../Cellar/gcc/5.5.0_4/lib/libgfortran.spec

Still it is not found! This is weird…

uche:~$ ldd /home/ubuntu/.julia/packages/MPI/U5ujD/deps/build/gen_constants 
	linux-vdso.so.1 =>  (0x00007ffcd7da4000)
	libgfortran.so.4 => not found
	libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fd950517000)
	/lib64/ld-linux-x86-64.so.2 (0x00007fd9508e1000)

So if you are not using Linuxbrew to install OpenMPI, what are you using? Because the OpenMPI in the Ubuntu repo is way too old and causes problems.


#14

Finally solved!!!

I found an Ubuntu PPA that provides libgfortran.so.4:

sudo add-apt-repository ppa:jonathonf/gcc-7.1
sudo apt-get update

sudo apt-get install gcc-7 g++-7
sudo apt-get install gfortran-7

And everything was fixed!

Thanks.
