Hmm… I understand the implication of all these outstanding PRs, but it would be brilliant if the devs could post a beginner-friendly upgrade path, if at all possible at this stage.
The package ecosystem is in a temporary state of flux while developers adapt to changes planned for Julia 1.0 and the new missing type implemented in DataFrames. Things should get better soon. In the meantime, I agree it's frustrating. My guess is that there is no general/simple solution for upgrading DataFrames at the moment.
Actually thanks to recent fixes in DataFrames 0.11.4, RDatasets will work again soon with that DataFrames version (albeit with deprecation warnings). See https://github.com/JuliaLang/METADATA.jl/pull/12808.
Thanks to everybody who worked towards this solution. Gadfly, Vega, ECharts (my package, currently unregistered) and many others will be able to move forward!
This fix just got merged into METADATA, which now allows all the packages I care about (i.e. maintain) to move forward with RDatasets/DataFrames 0.11. Thanks everyone!
I just spent around 20 minutes figuring this out (again; it was somehow silently downgraded when I was not paying attention). Removing some packages that depended on Gadfly did the trick. E.g. run
grep Gadfly ~/.julia/v0.6/**/REQUIRE
or similar to find them.
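In my case the actual fix then looked something like this in the Julia 0.6 REPL (a sketch; substitute whichever packages your grep turns up):

Pkg.rm("Gadfly")              # remove the dependent package that forced the downgrade
Pkg.update()                  # re-resolve; DataFrames is now free to move to 0.11
Pkg.installed("DataFrames")   # returns the resolved version, e.g. v"0.11.4"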
The fact that a single (albeit important) package can lead to such intricate dependency problems is somewhat disconcerting. If it weren't for Pkg3, I would be concerned.
I think part of the problem was the side-edit to METADATA to cap all packages using DataFrames to 0.10, rather than letting people experience broken packages. The effect of this, of course, was to make it really confusing as to why downgrades kept occurring.
I disagree. The trend of saying that you support arbitrary future versions of a package is just nonsense. Getting a clear error that a package is not supported is much better than getting run time errors or possibly silently getting wrong results.
Pkg3 will make it much easier to create multiple environments but it cannot help with trying to simultaneously use packages with incompatible dependencies. The only solution to that (ever) will just be to upgrade the packages that are incompatible.
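To make the environments point concrete, the Pkg3 workflow is supposed to look roughly like this (syntax taken from the current design, so treat it as a sketch that may still change; the project paths are made up):

pkg> activate ./analysis-old
pkg> add DataFrames@0.10
pkg> activate ./analysis-new
pkg> add DataFrames@0.11

Each environment records its own set of versions, so the two projects no longer fight each other; but within any single environment the resolver still has to find one mutually compatible set, which is why upgrading incompatible packages remains the only real fix.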
You’re reading more into my comment than I was saying. My comment was about the editing of METADATA to cap all of the packages. As a user, I can always go to GitHub and look at what REQUIRE says…except in the case of the DataFrames change.
Any package that depended on DataFrames became out of sync with its REQUIRE file. Only if you were savvy enough to read METADATA would you be able to understand why a package was forcing a downgrade (rather than being able to just look at the source code).
It’s a decision that was made, and so be it. I’m just saying that it caused me some confusion until I understood how all the packages were capped.
Sorry for reading too much into it. I totally agree that it is (very) confusing when the REQUIRE file in the package and METADATA are out of sync.
In fact, the reason this happens in the first place is that people play loose with their upper bounds, and when a breaking change happens, the upper bounds have to be “backported”. I think in the future we need to be more careful about this to prevent the situation from recurring.
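Concretely, “backporting” means editing the requires files of already-tagged versions directly in METADATA, e.g. (package and version here purely illustrative):

# METADATA.jl/SomePackage/versions/1.2.3/requires
julia 0.6
DataFrames 0.10.0 0.11.0

which is exactly why the REQUIRE on the package’s GitHub master no longer matches what Pkg actually resolves.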
What’s the best practice here? Should packages always cap their dependencies at the next (potentially nonexistent) semver version that would indicate a breaking change? When that next version comes out, the cap can always be relaxed later if everything is OK, correct? Asking for a friend.
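If I understand the REQUIRE format correctly (lower bound inclusive, upper bound exclusive), that would mean something like:

julia 0.6
DataFrames 0.10.0 0.11.0

i.e. accept any DataFrames 0.10.x but refuse 0.11 until someone has actually tested against it, at which point the cap gets bumped to 0.12.0 in a new tag.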
I’m not sure it’s fair to say that people “play loose” with upper bounds.
It’s simply unclear which upper bounds to put when you release a package. You will either have to revise them up in the future if a new compatible release comes out, or upper-bound them when the new release breaks compatibility. Both of these are unpleasant.
I don’t see an easy solution to this; maybe the automated JuliaCI testing infrastructure could be used to set such bounds more automatically?