Unexpected end of data when updating registries with LocalPackageServer

I’m also seeing errors updating registries today:

(@v1.10) pkg> st
  Installing known registries into `/usr/local/julia/depot`

ERRORS:
Unexpected end of archive

ERROR: Unexpected end of data : jl_E7Sdw3SLa9~
┌ Warning: unable to decompress and read archive
│   exception = EOFError: read end of file
└ @ Pkg.PlatformEngines /usr/local/julia/julia-1.10.2/share/julia/stdlib/v1.10/Pkg/src/PlatformEngines.jl:670

In particular, the us-west and us-east servers are returning different responses. For example:

julia> a = HTTP.get("https://us-west.pkg.julialang.org/registry/23338594-aafe-5451-b93e-139f81909106/69c29d65afe36e25cc466dae1000e0b8b952ee2c")
HTTP.Messages.Response:
"""
HTTP/1.1 200 OK
Server: nginx/1.23.3
Date: Tue, 09 Apr 2024 20:22:15 GMT
Content-Type: application/octet-stream      
Content-Length: 7444340
Connection: keep-alive
Last-Modified: Tue, 09 Apr 2024 20:22:11 GMT
ETag: "6615a373-719774"
Accept-Ranges: bytes
X-lb-strategy: pkgservers_hashed


⋮
7444340-byte body
"""

julia> b = HTTP.get("https://us-east.pkg.julialang.org/registry/23338594-aafe-5451-b93e-139f81909106/69c29d65afe36e25cc466dae1000e0b8b952ee2c")
HTTP.Messages.Response:
"""
HTTP/1.1 200 OK
Date: Tue, 09 Apr 2024 20:22:34 GMT
Content-Type: binary/octet-stream
Content-Length: 7444340
Connection: keep-alive
ETag: "b4e82205689f155b67c60f15e867a196"
Last-Modified: Tue, 09 Apr 2024 20:06:26 GMT
Vary: Accept-Encoding
Cache-Control: max-age=14400
CF-Cache-Status: MISS
Accept-Ranges: bytes
Report-To: {"endpoints":[{"url":"https:\/\/a.nel.cloudflare.com\/report\/v4?s=6VXQoMugMqXKounLZktdncu02BGZSOajvMiVlAWoE0Nw6fT3I9E9fto91XjojPuAj8T%2B2OqsXRZ3cRyOPayjtZm%2BRxnWM%2BaPuJ6qBiE58dAVS0HNTh%2BwjYmQgzZjNUI9bugeBPzg1gk%3D"}],"group":"cf-nel","max_age":604800}
NEL: {"success_fraction":0,"report_to":"cf-nel","max_age":604800}
Server: cloudflare
CF-RAY: 871d35be8bda0903-SEA
alt-svc: h3=":443"; ma=86400


⋮
7444340-byte body
"""

When I inspect the message body, I find the following HTML appended to the us-east contents:

<html>
<head><title>301 Moved Permanently</title></head>
<body>
<center><h1>301 Moved Permanently</h1></center>
<hr><center>nginx/1.23.3</center>
</body>
</html>
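For anyone wanting to check their own downloads for this, here is a small stdlib-only sketch. The helper names are hypothetical (not part of Pkg or HTTP.jl); it simply checks whether a payload ends in an HTML error page instead of the expected gzip data, using synthetic bytes as stand-ins for the real responses:

```julia
# Hypothetical helper (not part of Pkg or HTTP.jl): check whether a
# downloaded payload ends in an HTML error page rather than gzip data.
function has_trailing_html(bytes::Vector{UInt8}; window::Int = 256)
    # Slicing makes a fresh copy, so String() does not consume `bytes`.
    tail = String(bytes[max(1, length(bytes) - window + 1):end])
    return occursin("</html>", tail)
end

# Gzip streams start with the magic bytes 0x1f 0x8b:
is_gzip(bytes::Vector{UInt8}) =
    length(bytes) >= 2 && bytes[1] == 0x1f && bytes[2] == 0x8b

# Synthetic demo data standing in for the real registry tarballs:
clean   = UInt8[0x1f, 0x8b, 0x08, 0x00, 0x01, 0x02]
tainted = vcat(clean,
               Vector{UInt8}(codeunits("<html><body>301 Moved Permanently</body></html>")))

has_trailing_html(clean)    # false
has_trailing_html(tainted)  # true
```

A payload that fails `is_gzip` or passes `has_trailing_html` is a strong hint that an error page got mixed into the body.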

Is this an issue with the storage server?

That looks like a different issue, probably related to the initial rollout of Changes to PkgServers storage backend (call for beta testers). I cannot reproduce it, though:

$ curl -fsSL https://us-west.pkg.julialang.org/registry/23338594-aafe-5451-b93e-139f81909106/69c29d65afe36e25cc466dae1000e0b8b952ee2c | sha256sum
0b3e798df49a80e4068c5b65013c7b23316afe05e529003a4e1a0c3f56fd50b7  -

$ curl -fsSL https://us-east.pkg.julialang.org/registry/23338594-aafe-5451-b93e-139f81909106/69c29d65afe36e25cc466dae1000e0b8b952ee2c | sha256sum
0b3e798df49a80e4068c5b65013c7b23316afe05e529003a4e1a0c3f56fd50b7  -

How did you obtain this body? You seem to have the same number of bytes, at least:

Please see the code snippet below. We have a LocalPackageServer, located in the eastern US, that stopped working today. Teammates spun up another one in the western US that appears to be working, so I suspected something differs between us-west and us-east…

julia> open("us-west-response", "w") do io
       HTTP.get("https://us-west.pkg.julialang.org/registry/23338594-aafe-5451-b93e-139f81909106/69c29d65afe36e25cc466dae1000e0b8b952ee2c", response_stream=io)
       end
HTTP.Messages.Response:
"""
HTTP/1.1 200 OK
Server: nginx/1.23.3
Date: Tue, 09 Apr 2024 23:01:35 GMT
Content-Type: application/octet-stream
Content-Length: 7444340
Connection: keep-alive
Last-Modified: Tue, 09 Apr 2024 22:40:00 GMT
ETag: "6615c3c0-719774"
Accept-Ranges: bytes
X-lb-strategy: pkgservers_hashed

[Message Body was streamed]"""

julia> open("us-east-response", "w") do io
       HTTP.get("https://us-east.pkg.julialang.org/registry/23338594-aafe-5451-b93e-139f81909106/69c29d65afe36e25cc466dae1000e0b8b952ee2c", response_stream=io)
       end
HTTP.Messages.Response:
"""
HTTP/1.1 200 OK
Date: Tue, 09 Apr 2024 23:02:01 GMT
Content-Type: binary/octet-stream
Content-Length: 7444340
Connection: keep-alive
ETag: "b4e82205689f155b67c60f15e867a196"
Last-Modified: Tue, 09 Apr 2024 20:06:26 GMT
Vary: Accept-Encoding
Cache-Control: max-age=14400
CF-Cache-Status: HIT
Age: 2253
Accept-Ranges: bytes
Report-To: {"endpoints":[{"url":"https:\/\/a.nel.cloudflare.com\/report\/v4?s=PAXlM6fCjHYxbYL38geXPOeCVhlh3ty257YjhZCA%2BYFcQO%2Be8R3U9d93bJRVuiWXoGQywYZSXQqeKWOyjK21sorDCLt3P7%2FGTUYZnxrwDNi4ZMSFyrU6GCWzJKXMJxswX%2B6J5Kv6Erk%3D"}],"group":"cf-nel","max_age":604800}
NEL: {"success_fraction":0,"report_to":"cf-nel","max_age":604800}
Server: cloudflare
CF-RAY: 871e1f525cf330e1-SEA
alt-svc: h3=":443"; ma=86400

[Message Body was streamed]"""

julia> v_west = readlines("us-west-response");

julia> v_east = readlines("us-east-response");

julia> v_west[1]
"\x1f\x8b\b\0\0\0\0\0\x02\x03\xec\xbd\xd9z\eG\xb6.X\xd7|\x8a,\xd8\xdf\x11i+\xc9\x18s\x90[U\x96%\x96\xad*M\x9f\x06\u5f7K֡cJ\x11\x16\b\xf0\0\xa0dyW\x9d\xc7\xe8۾\xe9w\xe8\xfb~\xa2~\x84^+2\x13\xc8\t\xe0L\x91RrﲀDḏֿV\xfc+b\xdb\f\xfft\xd9\x7f\x84\x90X\xca\0\xff\x85\xbf\xe6\xbf\xf9\x97\xfagH\x1e\xff)\x90\x7f\xba\x82\xbf\xa3\xd9\\M\xa1*\x17\xd0\xc8Z\xe3n\xc8߶\x19\xee<V\xe3a\xe6f\xf3m\xbaM\xc9\xf6|r0\xba\xf0\xf1\x8f\x84X5\xfe\x9c\x11\x11\xd7ǟ\xb2(\x92\x7f"

julia> length(v_west)
24731

julia> v_east[1]
"<html>"

julia> length(v_east)
24738
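Comparing raw bytes rather than lines makes the divergence point easier to pin down. A sketch, using synthetic stand-ins for the two files (the real comparison would use `read("us-west-response")` and `read("us-east-response")`):

```julia
# Hypothetical sketch: locate the first byte at which two downloads differ.
# Synthetic stand-ins for read("us-west-response") / read("us-east-response"):
west = UInt8[0x1f, 0x8b, 0x08, 0x00, 0xaa, 0xbb]   # gzip magic bytes first
east = UInt8[0x3c, 0x68, 0x74, 0x6d, 0x6c, 0x3e]   # "<html>" first

n = min(length(west), length(east))
first_diff = findfirst(i -> west[i] != east[i], 1:n)

first_diff  # 1: the bodies diverge from the very first byte
```

That the east body starts with `<html>` while the west body starts with the gzip magic bytes suggests the extra HTML was written to the stream before the real content, not after it.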

Okay, that information would be nice to have up front next time :slight_smile:

This looks like a bug in HTTP.jl, then: the response body of a 301 redirect should not be written to the response stream. I did a quick bisect, and the bug seems to have been introduced between v1.5.5 and v1.6.3. Pinning HTTP.jl to 1.5 should fix your issue for now.
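For anyone needing the workaround, one way to apply the pin is via the documented Pkg API (a sketch; the version bound follows the bisect result above, adjust as needed):

```julia
using Pkg

# Pin HTTP.jl at the last release line known to be unaffected:
Pkg.pin(name = "HTTP", version = "1.5.5")
```

The equivalent Pkg REPL command is `pin HTTP@1.5.5`; remember to `free HTTP` once a fixed release is out.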

Edit: Opened HTTP.jl#1165.

Thanks so much for the quick response! I pinned HTTP to 1.5 and that indeed fixed the issue. Thanks!