Build Julia on NixOS

Hello @cstich, do you know if we could eventually improve this Nix code into a derivation whose Julia package manager works smoothly on NixOS? I see several dependencies here, but I guess they are only needed to compile Julia, not to cover the external dependencies of the Julia package manager. Or does this have to be done with overlays?

1 Like

@cstich, I ran this expression (after putting it in a default.nix file).

EDIT: I will put the working command(s) here now to avoid further confusion:

curl -O https://raw.githubusercontent.com/NixOS/nixpkgs/d65bcef36d38b77ecedd2c4b779c4faeb0c8e6d4/pkgs/development/compilers/julia/use-system-utf8proc-julia-1.3.patch
curl -O https://raw.githubusercontent.com/NixOS/nixpkgs/d65bcef36d38b77ecedd2c4b779c4faeb0c8e6d4/pkgs/development/compilers/julia/allow_nix_mtime.patch
nix-build -E "with import <nixpkgs> {}; callPackage ./. {}" -I nixpkgs=channel:nixos-20.03

Run ./result/bin/julia when the build is done.

Note that if you get an error like the one I got on WSL, along the lines of

$'\r': command not found

you'll have to strip the Windows carriage returns from your default.nix with sed, as in

sed -i.bak 's/\r$//' default.nix

There are new issues with the built julia.

Assume I am running

nix-shell -p gobjectIntrospection gtk3 gst_all_1.gstreamer 'python3.withPackages (p: with p; [matplotlib pygobject3 gst-python numpy])' curl which

In the nix-shell

which python3.7

/nix/store/cbc9hfdvvrfl6007zbdhp6ypa1s3p0mk-python3-3.7.7-env/bin/python3.7

So I can run $(which python3.7) and therein

>>> import numpy.random
>>> numpy.random.rand(1,2)

getting

array([[0.18651245, 0.81832922]])

As opposed to when I'm running this in the same nix-shell session:

$ PYTHON=$(which python3.7) ./result/bin/julia
julia> using PyCall
julia> @pyimport numpy.random as rng

where I'm getting

ERROR: PyError (PyImport_ImportModule
(…)
PyCall is currently configured to use the Python version at:

/nix/store/vxdbn33pk2f3j4sxs9mim0qfmxp3nr2r-python3-3.7.7-env/bin/python3.7
(…)
) <class 'ModuleNotFoundError'>
ModuleNotFoundError("No module named 'numpy'")

Thus I'm running

julia> ENV["PYTHON"]="/nix/store/cbc9hfdvvrfl6007zbdhp6ypa1s3p0mk-python3-3.7.7-env/bin/python3.7"; using Pkg; Pkg.build("PyCall");

but that doesn't change the error.

The Julia package manager works on Nix; what does not work is how the package manager resolves missing binary dependencies. Currently you need to manually provide all the required dependencies (or alternatively patch the binaries the Julia package manager downloads) in the environment within which you want to use Julia.

Recreating something like conda-shell is probably your best bet for a convenient user experience.
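For illustration, here is a minimal sketch of such an environment (julia is the package from nixpkgs; zlib and libpng are only placeholders for whatever native libraries your Julia packages actually need):

with import <nixpkgs> { };

# Minimal sketch: provide the native libraries a Julia package expects
# directly in the environment, since Pkg will not resolve them on Nix.
mkShell {
  buildInputs = [ julia zlib libpng ];
  shellHook = ''
    # Make the libraries visible to the binaries Pkg downloads.
    export LD_LIBRARY_PATH=${lib.makeLibraryPath [ zlib libpng ]}:$LD_LIBRARY_PATH
  '';
}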

PyCall only queries/changes the Python path when you build it. On my system I need to do something like this in my Nix expressions to make sure that PyCall picks up the correct Python path:

shellHook = ''
  julia -e 'ENV["PYTHON"]="${python-stuff}/bin/python"; using Pkg; Pkg.activate("./"); Pkg.build("PyCall")'
''

where python-stuff is the python3.withPackages myPackages part.

Edit: I am an idiot and should read comments more thoroughly; I am not sure why exactly you get the error you have. If you want, I can dump my whole Julia Nix environment here and you can have a look?

Edit2: Actually, have you tried setting the PYTHONPATH variable yet? I do have a comment there saying I need it.

# Set PYTHONPATH so that PyCall in julia finds the relevant packages
export PYTHONPATH=${python-stuff}/lib/python3.7/site-packages/ 
1 Like

@cstich, thank you very much for reaching out.
I would really love to examine your Julia environment to get it running on my setup as well, so if you could dump it here that would be very kind.

Can you help me with the shellHook, please? Does it go into the build expression, or somewhere else?

I tried your suggestions but so far have only managed to hack something together:

First I fired up a nix-shell and set PYTHONPATH inside it (./result-safe/bin/julia is exactly the result of building your nice Nix expression):

$ nix-shell -p which curl 'python3.withPackages (p: with p; [ numpy ])' --pure
$ PYTHONPATH=$(realpath $(dirname $(type -p python3))/../lib)/python3.7/site-packages ./result-safe/bin/julia

In julia then:

julia> using Pkg; Pkg.build("PyCall"); using PyCall
julia> @pyimport numpy
julia> numpy.random.rand(0,1)

I can't really tell why this works. I'm especially uncertain about the hard-coded /python3.7/site-packages part of the PYTHONPATH.
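One way to avoid hard-coding the python3.7 part would be to ask the interpreter itself where the packages live; a rough, untested sketch (it assumes numpy is in the environment and simply takes the directory that contains it):

# Derive the site-packages directory from the interpreter instead of hard-coding it.
PYTHONPATH=$(python3 -c 'import os, numpy; print(os.path.dirname(os.path.dirname(numpy.__file__)))') ./result-safe/bin/julia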

Does it work now for you?

The shellHook is an option when building with pkgs.stdenv.mkDerivation.
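In its simplest form it is just another attribute of the derivation; a stripped-down sketch (numpy here is only an example package):

with import <nixpkgs> { };

let
  python-stuff = python3.withPackages (p: [ p.numpy ]);
in
stdenv.mkDerivation {
  name = "julia-env";
  buildInputs = [ julia python-stuff ];
  # shellHook runs every time you enter the environment with nix-shell.
  shellHook = ''
    julia -e 'ENV["PYTHON"]="${python-stuff}/bin/python"; using Pkg; Pkg.build("PyCall")'
  '';
}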

And sure, here is the whole mess of a Nix expression I use for my Julia environment.

with import <nixpkgs> {};

let
  
  unstable = import <nixos-unstable> {};
  IJulia = "/home/christoph/.julia/packages/IJulia/DrVMH";
  config = { display_name = "Julia Nix";
              argv = [
                "${julia}/bin/julia" 
                "-i"
                "--startup-file=yes"
                "--color=yes"
                "--project=@."
                "${IJulia}/src/kernel.jl"
                "{connection_file}"
              ];
              language = "julia";
              interrupt_mode = "signal";
            };
  configFile = writeText "kernel.json" (builtins.toJSON config);

  myPackages = pythonPackages: with pythonPackages; [
    # Install python dependencies
    asdf
    cython
    jupyter 
    matplotlib
    numpy
    pyemd
    scipy
    scikitlearn
    setuptools
    ortools
    pandas
    pip
    pyemd

    # coc-vim dependencies
    black
    mypy
    pylama
    pylint

     ];
  python-stuff = python3.withPackages myPackages;

  extraLibs = [
    cudatoolkit
    qt4
    glibc
    cairo
  ];


  libPath = lib.makeLibraryPath [
    qt4 
    gcc9 
    stdenv.cc.cc.lib
  ];

in

# pkgs.mkShell {
  pkgs.stdenv.mkDerivation {
  name = "sandbox-julia";
  buildInputs = with pkgs; [
    julia
    python-stuff
    # Apparently we need a curl install here or otherwise this happens:
    # https://github.com/JuliaPackaging/BinaryBuilder.jl/issues/527
    curl 
    zlib
    zlib.dev 
    zlib.out
    
    cmake
    llvm_8

    cudatoolkit
    git
    gitRepo
    gnumake
    gnupg
    gperf
    libGLU
    linuxPackages.nvidia_x11
    m4
    ncurses5
    procps
    unzip
    utillinux
    xorg.libX11
    xorg.libXext
    xorg.libXi
    xorg.libXmu
    xorg.libXrandr
    xorg.libXv
    zlib
 
    neovim 
     ];
  shellHook = ''
    WORKING_DIR=$PWD
    echo $WORKING_DIR
    # CUDA shell hooks
    export CUDA_PATH=${pkgs.cudatoolkit}
    export EXTRA_LDFLAGS="-L/lib -L${pkgs.linuxPackages.nvidia_x11}/lib"
    export EXTRA_CCFLAGS="-I/usr/include"

    # Stuff for julia jupyter kernel
    # julia -e 'using Pkg; Pkg.add("IJulia")'
    
    # Python stuff for coc-vim
    rm -f env
    ln -s ${python-stuff}/bin env

    rm -f env_julia
    ln -s ${julia} env_julia

    # Set PYTHONPATH so that PyCall in julia finds the relevant packages
    export PYTHONPATH=${python-stuff}/lib/python3.7/site-packages/ 

    # Setup a local pip build directory
    alias pip="PIP_PREFIX='$(pwd)/_build/pip_packages' \pip"
    export PYTHONPATH="$(pwd)/_build/pip_packages/lib/python3.7/site-packages:$PYTHONPATH"
    unset SOURCE_DATE_EPOCH

    FIRST_RUN=false
    if [ "$FIRST_RUN" = true ]; then 
      # Pip install python packages here
      pip install git+https://github.com/rflamary/POT
      pip install gputil

      export CUDA_TOOLKIT_ROOT_DIR=${cudatoolkit}
      export CUDA_ARCH=52
      rm -rf _build/pip_packages/lib/python3.7/site-packages/libKMCUDA
      mkdir -p _build/pip_packages/lib/python3.7/site-packages/libKMCUDA
      git clone --depth=1 https://github.com/src-d/kmcuda $(pwd)/_build/pip_packages/lib/python3.7/site-packages/libKMCUDA
      patch _build/pip_packages/lib/python3.7/site-packages/libKMCUDA/src/setup.py < libKMCUDA_CUDA_ARCH.patch
      pip uninstall -y libKMCUDA
      pip install _build/pip_packages/lib/python3.7/site-packages/libKMCUDA/src
    fi

    # cd $WORKING_DIR
    # Install jupyter extensions
    # Create required directory in case (optional)
    mkdir -p $(jupyter --data-dir)/nbextensions
    # Clone the repository
    cd $(jupyter --data-dir)/nbextensions
    git clone --depth 1 https://github.com/lambdalisue/jupyter-vim-binding vim_binding
    # Activate the extension
    jupyter nbextension enable vim_binding/vim_binding
    # Go back to the working dir
    cd $WORKING_DIR
    
    # manually setup the kernel
    # TODO figure out how to use jupyter-kernel.create
    KERNEL_DIR=~/.local/share/jupyter/kernels/julia
    mkdir -p $KERNEL_DIR
    ln -sf ${configFile} $KERNEL_DIR/kernel.json
    ln -sf ${IJulia}/deps/logo-32x32.png $KERNEL_DIR/logo-32x32.png
    ln -sf ${IJulia}/deps/logo-64x64.png $KERNEL_DIR/logo-64x64.png

    # Julia Threads
    export JULIA_NUM_THREADS=12

    # The Cmake binary fails, so we have to build it from source
    # julia -e 'ENV["CMAKE_JL_BUILD_FROM_SOURCE"] = 1'
    export CMAKE_JL_BUILD_FROM_SOURCE=1

    # Make sure CUDAnative does not use BinaryBuilder
    export JULIA_CUDA_USE_BINARYBUILDER=false

    # julia -e 'using Pkg; Pkg.activate("./"); Pkg.add("GR")'
    # Patch the GKS binary for GR
     patchelf \
     --set-interpreter ${glibc}/lib/ld-linux-x86-64.so.2 \
     --set-rpath "${libPath}" \
     /home/christoph/.julia/packages/GR/cRdXQ/deps/gr/bin/gksqt

    # Configure PyCall to pick up the correct python binary
    # julia -e 'ENV["PYTHON"]="${python-stuff}/bin/python"; using Pkg; Pkg.activate("./"); Pkg.build("PyCall")'

    # Every time you get a new julia binary, we need to nuke the julia package cache...
    # Maybe?
    # rm -rf /home/christoph/.julia/
  '';
}
1 Like

Hi @cstich, I made some progress using your latest posted config.

To your question whether my former setup worked: partly it did, yes. I only had to spawn an extra nix-shell with the Python dependencies and PYTHONPATH set to have Julia recognize my Nix-managed Python, as you wrote.

Now to your latest config dump. I am posting my copy of it because I made minor modifications to fit my system, but I would dare say that it works. Basically it ended up being a merge of your expressions posted here and here.

I built it as usual with nix-build --pure -E "with import <nixpkgs> {}; callPackage ./. {}" -I nixpkgs=channel:nixos-20.03 and I don't know whether that is exactly the correct way of building it, but it seems to work.

# potentially dangerous mods I remember were:
# cmake as a buildInput seemed to break the build, removed
# cuda-related stuff led to the build hanging endlessly on "building '/nix/store/0x9qbwh8xi5i8iagivl0nnsxcnwbijcm-cuda_10.2.89_440.33.01_linux.run.drv'...", removed
# hard-set USE_BLAS64=1, as otherwise I got a warning (about openblas.blas64) whose exact contents I don't remember
# https://discourse.julialang.org/t/using-julia-with-nixos/35129/20
{ stdenv, fetchurl, fetchzip, fetchFromGitHub
# build tools
, gfortran, m4, makeWrapper, patchelf, perl, which, python2
, cmake
# libjulia dependencies
, libunwind, readline, utf8proc, zlib
# standard library dependencies
, curl, fftwSinglePrec, fftw, gmp, libgit2, mpfr, openlibm, openspecfun, pcre2
# linear algebra
, openblas, arpack
# llvm
, llvm
, julia
, git, gitRepo, gnumake, gnupg, gperf, libGLU, ncurses5, procps, unzip, utillinux, neovim, python3
, xorg
}:
with import <nixpkgs> {};
with stdenv.lib;

# All dependencies must use the same OpenBLAS.
let
  arpack_ = arpack;
in
let
  arpack = arpack_.override { inherit openblas; };
in
  let
  IJulia = "/home/dkahlenberg/.julia/packages/IJulia/DrVMH";
  config = { display_name = "Julia Nix";
              argv = [
                "${julia}/bin/julia"
                "-i"
                "--startup-file=yes"
                "--color=yes"
                "--project=@."
                "${IJulia}/src/kernel.jl"
                "{connection_file}"
              ];
              language = "julia";
              interrupt_mode = "signal";
            };
  configFile = writeText "kernel.json" (builtins.toJSON config);
  myPackages = pythonPackages: with pythonPackages; [
    # Install python dependencies
    #asdf
    cython
    jupyter
    matplotlib
    numpy
    pyemd
    scipy
    scikitlearn
    setuptools
    ortools
    pandas
    pip
    pyemd

    # coc-vim dependencies
    black
    mypy
    pylama
    pylint

     ];
  python-stuff = python3.withPackages myPackages;

  extraLibs = [
    qt4
    glibc
    cairo
  ];


  libPath = lib.makeLibraryPath [
    qt4
    gcc9
    stdenv.cc.cc.lib
  ];
  majorVersion = "1";
  minorVersion = "5";
  maintenanceVersion = "0";
  src_sha256 = "1x10q46q2bkr19nqgc05kx8sachh8dz8v7qrvqwdagr4g8yxmqrs";
  version = "${majorVersion}.${minorVersion}.${maintenanceVersion}";
in

stdenv.mkDerivation rec {
  pname = "julia";
  inherit version;

  src = fetchzip {
     url = "https://github.com/JuliaLang/julia/releases/download/v1.5.0-rc1/julia-1.5.0-rc1-full.tar.gz";
     sha256 = src_sha256;
   };

  prePatch = ''
    export PATH=$PATH:${cmake}/bin
    '';

  patches = [
    ./use-system-utf8proc-julia-1.3.patch

    # Julia recompiles a precompiled file if the mtime stored *in* the
    # .ji file differs from the mtime of the .ji file.  This
    # doesn't work in Nix because Nix changes the mtime of files in
    # the Nix store to 1. So patch Julia to accept mtimes of 1.
    ./allow_nix_mtime.patch
  ];

  postPatch = ''
    patchShebangs . contrib
    for i in backtrace cmdlineargs; do
      mv test/$i.jl{,.off}
      touch test/$i.jl
    done
    rm stdlib/Sockets/test/runtests.jl && touch stdlib/Sockets/test/runtests.jl
    rm stdlib/Distributed/test/runtests.jl && touch stdlib/Distributed/test/runtests.jl
    # LibGit2 fails with a weird error, so we skip it as well now
    rm stdlib/LibGit2/test/runtests.jl && touch stdlib/LibGit2/test/runtests.jl
    sed -e 's/Invalid Content-Type:/invalid Content-Type:/g' -i ./stdlib/LibGit2/test/libgit2.jl
    sed -e 's/Failed to resolve /failed to resolve /g' -i ./stdlib/LibGit2/test/libgit2.jl
  '';

  buildInputs = [
    arpack fftw fftwSinglePrec gmp libgit2 libunwind mpfr
    pcre2.dev openblas openlibm openspecfun readline utf8proc
    zlib

    julia
    python-stuff
    # Apparently we need a curl install here or otherwise this happens:
    # https://github.com/JuliaPackaging/BinaryBuilder.jl/issues/527
    curl
    zlib
    zlib.dev
    zlib.out

    git
    gitRepo
    gnumake
    gnupg
    gperf
    libGLU
    m4
    ncurses5
    procps
    unzip
    utillinux
    xorg.libX11
    xorg.libXext
    xorg.libXi
    xorg.libXmu
    xorg.libXrandr
    xorg.libXv

    neovim
  ]
  ++ stdenv.lib.optionals stdenv.isDarwin [CoreServices ApplicationServices]
  ;

  nativeBuildInputs = [ curl gfortran m4 makeWrapper patchelf perl python2 which ];

  makeFlags =
    let
      arch = head (splitString "-" stdenv.system);
      march = { x86_64 = stdenv.hostPlatform.platform.gcc.arch or "x86-64"; i686 = "pentium4"; }.${arch}
              or (throw "unsupported architecture: ${arch}");
      # Julia requires Pentium 4 (SSE2) or better
      cpuTarget = { x86_64 = "x86-64"; i686 = "pentium4"; }.${arch}
                  or (throw "unsupported architecture: ${arch}");
    in [
      "ARCH=${arch}"
      "MARCH=${march}"
      "JULIA_CPU_TARGET=${cpuTarget}"
      "PREFIX=$(out)"
      "prefix=$(out)"
      "SHELL=${stdenv.shell}"

      "USE_SYSTEM_BLAS=1"
#      "USE_BLAS64=${if openblas.blas64 then "1" else "0"}"
      "USE_BLAS64=1"
      "LIBBLAS=-lopenblas"
      "LIBBLASNAME=libopenblas"

      "USE_SYSTEM_LAPACK=1"
      "LIBLAPACK=-lopenblas"
      "LIBLAPACKNAME=libopenblas"

      "PCRE_CONFIG=${pcre2.dev}/bin/pcre2-config"
      "PCRE_INCL_PATH=${pcre2.dev}/include/pcre2.h"
      "USE_SYSTEM_READLINE=1"
      "USE_SYSTEM_UTF8PROC=1"
      "USE_SYSTEM_ZLIB=1"

      "USE_BINARYBUILDER=0"
    ];

  LD_LIBRARY_PATH = makeLibraryPath [
    arpack fftw fftwSinglePrec gmp libgit2 mpfr openblas openlibm
    openspecfun pcre2
  ];

  enableParallelBuilding = true;

  doCheck = !stdenv.isDarwin;
  checkTarget = false;
  # Julia's tests require read/write access to $HOME
  preCheck = ''
    export HOME="$NIX_BUILD_TOP"
  '';

  preBuild = ''
    sed -e '/^install:/s@[^ ]*/doc/[^ ]*@@' -i Makefile
    sed -e '/[$](DESTDIR)[$](docdir)/d' -i Makefile
    export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}
  '';

  postInstall = ''
    # Symlink shared libraries from LD_LIBRARY_PATH into lib/julia,
    # as using a wrapper with LD_LIBRARY_PATH causes segmentation
    # faults when program returns an error:
    #   $ julia -e 'throw(Error())'
    find $(echo $LD_LIBRARY_PATH | sed 's|:| |g') -maxdepth 1 -name '*.${if stdenv.isDarwin then "dylib" else "so"}*' | while read lib; do
      if [[ ! -e $out/lib/julia/$(basename $lib) ]]; then
        ln -sv $lib $out/lib/julia/$(basename $lib)
      fi
    done
  '';

  shellHook = ''
    WORKING_DIR=$PWD
    echo $WORKING_DIR
    export EXTRA_LDFLAGS="-L/lib"
    export EXTRA_CCFLAGS="-I/usr/include"

    # Stuff for julia jupyter kernel
    # julia -e 'using Pkg; Pkg.add("IJulia")'

    # Python stuff for coc-vim
    rm -f env
    ln -s ${python-stuff}/bin env

    rm -f env_julia
    ln -s ${julia} env_julia

    # Set PYTHONPATH so that PyCall in julia finds the relevant packages
    export PYTHONPATH=${python-stuff}/lib/python3.7/site-packages/

    # Setup a local pip build directory
    alias pip="PIP_PREFIX='$(pwd)/_build/pip_packages' \pip"
    export PYTHONPATH="$(pwd)/_build/pip_packages/lib/python3.7/site-packages:$PYTHONPATH"
    unset SOURCE_DATE_EPOCH

    FIRST_RUN=false
    if [ "$FIRST_RUN" = true ]; then
      # Pip install python packages here
      :
    fi

    # cd $WORKING_DIR
    # Install jupyter extensions
    # Create required directory in case (optional)
    mkdir -p $(jupyter --data-dir)/nbextensions
    # Clone the repository
    cd $(jupyter --data-dir)/nbextensions
    git clone --depth 1 https://github.com/lambdalisue/jupyter-vim-binding vim_binding
    # Activate the extension
    jupyter nbextension enable vim_binding/vim_binding
    # Go back to the working dir
    cd $WORKING_DIR

    # manually setup the kernel
    # TODO figure out how to use jupyter-kernel.create
    KERNEL_DIR=~/.local/share/jupyter/kernels/julia
    mkdir -p $KERNEL_DIR
    ln -sf ${configFile} $KERNEL_DIR/kernel.json
    ln -sf ${IJulia}/deps/logo-32x32.png $KERNEL_DIR/logo-32x32.png
    ln -sf ${IJulia}/deps/logo-64x64.png $KERNEL_DIR/logo-64x64.png

    # Julia Threads
    export JULIA_NUM_THREADS=12

    # The Cmake binary fails, so we have to build it from source
    # julia -e 'ENV["CMAKE_JL_BUILD_FROM_SOURCE"] = 1'
    #export CMAKE_JL_BUILD_FROM_SOURCE=1

    # julia -e 'using Pkg; Pkg.activate("./"); Pkg.add("GR")'
    # Patch the GKS binary for GR
     patchelf \
     --set-interpreter ${glibc}/lib/ld-linux-x86-64.so.2 \
     --set-rpath "${libPath}" \
     /home/dkahlenberg/.julia/packages/GR/cRdXQ/deps/gr/bin/gksqt

    # Configure PyCall to pick up the correct python binary
    # julia -e 'ENV["PYTHON"]="${python-stuff}/bin/python"; using Pkg; Pkg.activate("./"); Pkg.build("PyCall")'

    # Every time you get a new julia binary, we need to nuke the julia package cache...
    # Maybe?
    # rm -rf /home/dkahlenberg/.julia/
  '';

  passthru = {
    inherit majorVersion minorVersion maintenanceVersion;
    site = "share/julia/site/v${majorVersion}.${minorVersion}";
  };

  meta = {
    description = "High-level performance-oriented dynamical language for technical computing";
    homepage = https://julialang.org/;
    license = stdenv.lib.licenses.mit;
    maintainers = with stdenv.lib.maintainers; [ raskin rob garrison ];
    platforms = [ "i686-linux" "x86_64-linux" "x86_64-darwin" ];
    broken = stdenv.isi686;
  };
}

When I run the built ./result/bin/julia directly and, in the REPL, run using PyCall; @pyimport matplotlib, I get the ERROR: PyError (PyImport_ImportModule again. So instead of running it directly I run nix-shell --pure -E "with import <nixpkgs> {}; callPackage ./. {}" --run "./result/bin/julia" -I nixpkgs=channel:nixos-20.03, and then using PyCall; @pyimport matplotlib works inside the REPL with no problems.

Good intel and progress here. Thanks for the shoutout @cstich; I see some resemblance to https://github.com/NixOS/nixpkgs/issues/70536#issuecomment-539146690, but I can't take all the credit as I definitely based that on other comments and threads from quite a while ago.

Here's another one I have lying around in case there's something useful for someone: https://gist.github.com/tbenst/c8247a1abcf318d231c396dcdd1f5304

Joining the discussion very late, I currently have a simple (and dirty) Nix expression to download and patch a binary Julia release:

# vim: set ts=8 sw=2 sts=2 et:

{ pkgs ? import <nixpkgs> { } }:
with pkgs;
stdenv.mkDerivation rec {
  pname = "julia_bin";
  version = "1.5.2";

  src = fetchurl {
    url = "https://julialang-s3.julialang.org/bin/linux/x64/1.5/julia-${version}-linux-x86_64.tar.gz";
    # Use `nix-prefetch-url` to get the hash.
    sha256 = "0c26b11qy4csws6vvi27lsl0nmqszaf7lk1ya0jrg8zgvkx099vd";
  };

  nativeBuildInputs = [ autoPatchelfHook ];

  # Stripping the shared libraries breaks dynamic loading.
  dontStrip = true;

  installPhase = ''
    mkdir -p $out
    tar -x -C $out -f $src --strip-components 1
    # Lacks a string table, so we are unable to patch it.
    rm $out/lib/julia/libccalltest.so.debug
    # Patch for pre-compilation as the Nix store file time stamps are pinned to the start of the epoch.
    sed -i 's/\(ftime != trunc(ftime_req, digits=6)\)$/\1 \&\& ftime != 1.0/' $out/share/julia/base/loading.jl
    grep '&& ftime != 1.0$' $out/share/julia/base/loading.jl > /dev/null || exit 1
  '';

  meta = with stdenv.lib; {
    description =
      "High-level performance-oriented dynamical language for technical computing";
    homepage = "https://julialang.org";
    license = licenses.mit;
    platforms = platforms.linux;
  };
}

This works well for most use cases, even with the Julia package manager as it provides its own standalone libraries for most packages these days. However, it falls flat on its face for something like FFMPEG.jl that lifts in its own binaries.

Getting the package manager to work even for these cases should be as simple as patching Pkg.jl's Artifacts.jl here, along pretty much the same lines as rustup does with its Nix expression, with a patch that dynamically calls patchelf to swap out the interpreter for the dynamic linker provided by Nix.

However, this would require rebuilding the Julia system image, which is where I got stuck, and I will have to abandon this project until I am less bogged down at work (weeks, probably...). What I fear is that this may require building Julia from source to get that system image, but I was unaware that @cstich had an expression for a more modern Julia version than the somewhat ancient 1.3 that is in nixpkgs. So perhaps my hour of digging today can lead to someone else getting it working more swiftly?

Personally, while I would be happy to have support for Julia with packages in Nix, I honestly do not mind that much having only the (patched) Julia binary managed by Nix, as the Julia package ecosystem these days is more than nice enough to manage dependencies on a project-by-project basis without having to resort to nix-shell.

2 Likes

I do actually have a somewhat working version of julia 1.5.2 compiled from source locally (it complains about various tests failing, so I just skip them for now). I should probably finally open a pull request for that at nixpkgs and I guess people (@ninjin) can then take that and patch the source file accordingly?

2 Likes

Sounds good, just link the PR here and I will be happy to contribute. Ignoring the build times and any unforeseen bumps, it should not take long to produce a working patch.

1 Like

@ninjin FYI, I just submitted a pull request for 1.5.2

Edit: Confusingly, there also seems to be work going on in a separate pull request.

3 Likes

Hi all, I've been working on a tool for creating Julia environments in Nix, now that the new and improved Pkg3 and Pkg.Artifacts systems exist. It's brand new and I've only tried it with a small number of packages, but it seems to work. I'd love any feedback or bug reports. You can find it here: https://github.com/thomasjm/julia2nix. (I also posted this on the main GitHub issue.)

4 Likes

Now that Julia 1.5.3 has landed in NixOS (Great work guys!) I have been trying to pull together a comprehensive NixOS Julia shell.nix script, a bit like the one earlier in this thread, that includes NixOS libraries and binaries for common Julia packages.

I have started a GitHub repo.

It works for many things, but Gtk is still a particular problem. Does anybody have any contributions or something better please?

1 Like

@robblackwell Thank you for the link!

In my case, rather than a shell.nix script, I have decided to add my own changes directly to a julia.nix derivation. Each time you add a new dependency and rebuild, Julia gets recompiled, which is not the best scenario, but for me it is fine. For example, rather than going through PyCall when it doesn't find netcdf, I have added the netcdf dependency directly to the derivation, and PyCall is no longer used for that.

Hi @robblackwell, my understanding is that the new Julia Artifacts.toml system will make it unnecessary to manually deal with dependencies like this, because packages will be able to explicitly specify their native dependencies in a per-platform, reproducible way.

For example, IJulia already works fine with my julia2nix package because its downstream dependencies use the artifacts system. (I don't know off the top of my head if Gtk works yet.)

I think the best way to push the Julia/NixOS ecosystem forward is to open PRs against Julia packages that donā€™t use artifacts yet and get them on the new system, and then nobody will need to maintain an external mapping of dependencies.

4 Likes

Thanks @thomasjm, very interesting. So artifact binaries and libraries from JuliaBinaryWrappers should work on NixOS unmodified? I'm still trying to wrap my head around this and understand the future Julia on Nix story.

It's not easy for me to use julia2nix yet because my NixOS system is on 20.09 (stable) and julia_15 is in unstable. Is there an easy way to tell julia2nix which channel to use, please?

If you click on them and see an Artifacts.toml file in the repo, then yes, they should work. It seems that most/all of them do. (I'm not claiming my system is the definitive future Julia on Nix story by any means, but given that they've built this nice Nix-inspired artifacts system on the Julia side, it seems natural to leverage it on the Nix side, so I'd expect any eventual solution to use the same approach.)
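For packages that are already installed locally, a quick way to check (assuming the default ~/.julia depot):

# List installed packages that ship an Artifacts.toml, i.e. use the artifacts system.
find ~/.julia/packages -maxdepth 3 -name Artifacts.toml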

I don't use NixOS, but I think if you add the unstable channel to your system following the normal Nix documentation then things will work. julia2nix depends on julia_15 via a single nix-shell -i shebang in one of the scripts. The exact version isn't crucial, as long as it's new enough to understand artifacts. I'm not positive how nix-shell searches for packages; it's possible you'll need to tweak the NIX_PATH also.
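Something along these lines might do it (untested; it assumes the scripts resolve <nixpkgs> through NIX_PATH):

# Make <nixpkgs> resolve to the unstable channel for this shell session,
# so that nix-shell shebangs asking for julia_15 can find it.
export NIX_PATH=nixpkgs=channel:nixos-unstable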

1 Like

Does anyone have a working 1.6 expression? I tried just bumping 1.5.3 but got some errors.

{ stdenv, fetchurl, fetchzip, fetchFromGitHub
# build tools
, gfortran, m4, makeWrapper, patchelf, perl, which, python2
, cmake
# libjulia dependencies
, libunwind, readline, utf8proc, zlib
# standard library dependencies
, curl, fftwSinglePrec, fftw, libgit2, mpfr, openlibm, openspecfun, pcre2
# linear algebra
, blas, lapack, arpack
# Darwin frameworks
, CoreServices, ApplicationServices
}:

assert (!blas.isILP64) && (!lapack.isILP64);

with stdenv.lib;

let
  majorVersion = "1";
  minorVersion = "6";
  maintenanceVersion = "0-beta1";
  src_sha256 = sha256:0v3fw5rqx09ivg21jhwxi2xf7ys4qlzyr1ciys3hj8nwxcghn3v8;
  version = "${majorVersion}.${minorVersion}.${maintenanceVersion}";
in

stdenv.mkDerivation rec {
  pname = "julia";
  inherit version;

   src = fetchzip {
     url = "https://github.com/JuliaLang/julia/releases/download/v${version}/julia-${version}-full.tar.gz";
     sha256 = src_sha256;
   };

  patches = [
    ./use-system-utf8proc-julia-1.3.patch

    # Julia recompiles a precompiled file if the mtime stored *in* the
    # .ji file differs from the mtime of the .ji file.  This
    # doesn't work in Nix because Nix changes the mtime of files in
    # the Nix store to 1. So patch Julia to accept mtimes of 1.
    ./allow_nix_mtime.patch
  ];

  postPatch = ''
    patchShebangs . contrib
    for i in backtrace cmdlineargs; do
      mv test/$i.jl{,.off}
      touch test/$i.jl
    done
    rm stdlib/Sockets/test/runtests.jl && touch stdlib/Sockets/test/runtests.jl
    rm stdlib/Distributed/test/runtests.jl && touch stdlib/Distributed/test/runtests.jl
    # LibGit2 fails with a weird error, so we skip it as well now
    rm stdlib/LibGit2/test/runtests.jl && touch stdlib/LibGit2/test/runtests.jl
    sed -e 's/Invalid Content-Type:/invalid Content-Type:/g' -i ./stdlib/LibGit2/test/libgit2.jl
    sed -e 's/Failed to resolve /failed to resolve /g' -i ./stdlib/LibGit2/test/libgit2.jl
  '';

  dontUseCmakeConfigure = true;

  enableParallelBuilding = true;

  buildInputs = [
    arpack fftw fftwSinglePrec libgit2 libunwind mpfr
    pcre2.dev blas lapack openlibm openspecfun readline utf8proc
    zlib
  ] ++ stdenv.lib.optionals stdenv.isDarwin [CoreServices ApplicationServices];

  nativeBuildInputs = [ curl gfortran m4 makeWrapper patchelf perl python2 which cmake ];

  makeFlags =
    let
      arch = head (splitString "-" stdenv.system);
      march = {
        x86_64 = stdenv.hostPlatform.platform.gcc.arch or "x86-64";
        i686 = "pentium4";
        aarch64 = "armv8-a";
      }.${arch}
              or (throw "unsupported architecture: ${arch}");
      # Julia requires Pentium 4 (SSE2) or better
      cpuTarget = { x86_64 = "x86-64"; i686 = "pentium4"; aarch64 = "generic"; }.${arch}
                  or (throw "unsupported architecture: ${arch}");
    # Julia applies a lot of patches to its dependencies, so for now do not use the system LLVM
    # https://github.com/JuliaLang/julia/tree/master/deps/patches
    in [
      "ARCH=${arch}"
      "MARCH=${march}"
      "JULIA_CPU_TARGET=${cpuTarget}"
      "PREFIX=$(out)"
      "prefix=$(out)"
      "SHELL=${stdenv.shell}"

      "USE_SYSTEM_BLAS=1"
      "USE_BLAS64=${if blas.isILP64 then "1" else "0"}"

      "USE_SYSTEM_LAPACK=1"

      "USE_SYSTEM_ARPACK=1"
      "USE_SYSTEM_FFTW=1"
      "USE_SYSTEM_GMP=0"
      "USE_SYSTEM_LIBGIT2=1"
      "USE_SYSTEM_LIBUNWIND=1"

      "USE_SYSTEM_MPFR=1"
      "USE_SYSTEM_OPENLIBM=1"
      "USE_SYSTEM_OPENSPECFUN=1"
      "USE_SYSTEM_PATCHELF=1"
      "USE_SYSTEM_PCRE=1"
      "PCRE_CONFIG=${pcre2.dev}/bin/pcre2-config"
      "PCRE_INCL_PATH=${pcre2.dev}/include/pcre2.h"
      "USE_SYSTEM_READLINE=1"
      "USE_SYSTEM_UTF8PROC=1"
      "USE_SYSTEM_ZLIB=1"

      "USE_BINARYBUILDER=0"
    ];

  LD_LIBRARY_PATH = makeLibraryPath [
    arpack fftw fftwSinglePrec libgit2 mpfr blas openlibm
    openspecfun pcre2 lapack
  ];

  # Julia's tests require read/write access to $HOME
  preCheck = ''
    export HOME="$NIX_BUILD_TOP"
  '';

  preBuild = ''
    sed -e '/^install:/s@[^ ]*/doc/[^ ]*@@' -i Makefile
    sed -e '/[$](DESTDIR)[$](docdir)/d' -i Makefile
    export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}
  '';

  postInstall = ''
    # Symlink shared libraries from LD_LIBRARY_PATH into lib/julia,
    # as using a wrapper with LD_LIBRARY_PATH causes segmentation
    # faults when program returns an error:
    #   $ julia -e 'throw(Error())'
    find $(echo $LD_LIBRARY_PATH | sed 's|:| |g') -maxdepth 1 -name '*.${if stdenv.isDarwin then "dylib" else "so"}*' | while read lib; do
      if [[ ! -e $out/lib/julia/$(basename $lib) ]]; then
        ln -sv $lib $out/lib/julia/$(basename $lib)
      fi
    done
  '';

  passthru = {
    inherit majorVersion minorVersion maintenanceVersion;
    site = "share/julia/site/v${majorVersion}.${minorVersion}";
  };

  meta = {
    description = "High-level performance-oriented dynamical language for technical computing";
    homepage = "https://julialang.org/";
    license = stdenv.lib.licenses.mit;
    maintainers = with stdenv.lib.maintainers; [ raskin rob garrison ];
    platforms = [ "i686-linux" "x86_64-linux" "x86_64-darwin" "aarch64-linux" ];
    broken = stdenv.isi686;
  };
}
The build fails with:

    LINK usr/lib/libjulia-internal.so.1.6
Warning: git information unavailable; versioning information limited
    JULIA usr/lib/julia/corecompiler.ji
ERROR: Unable to load dependent library /build/source/usr/bin/../lib/libopenlibm.so
Message:/build/source/usr/bin/../lib/libopenlibm.so: cannot open shared object file: No such file or directory
make[1]: *** [sysimage.mk:61: /build/source/usr/lib/julia/corecompiler.ji] Error 1
make: *** [Makefile:82: julia-sysimg-ji] Error 2
builder for '/nix/store/hgyd2ra9w4sngj4p7siyxhh8l89f5dj3-julia-1.6.0-beta1.drv' failed with exit code 2
error: build of '/nix/store/hgyd2ra9w4sngj4p7siyxhh8l89f5dj3-julia-1.6.0-beta1.drv' failed
1 Like