Allow renv::restore to help updating packages when restore fails #1893
Sounds like you want
I'm not entirely sure if this is the same thing that @torbjorn is commenting on, but I can confirm that upgrading 30 or so renv-controlled repos from R 4.1 to R 4.2, and then a year later to R 4.3, did require a lot of manual effort. This was mainly down to changes in
Ideally we'd keep the same package versions when upgrading the R version - we only find out which packages have to be upgraded by renv::restore trial and error.
At this point, with tens of projects at hand, I'll likely write a try-catch function to do the above.
I'm sourcing a script that does this for the moment to handle this:

    # Keep retrying renv::restore(); each time it fails on a package,
    # update that package and record the new version, then try again.
    while (TRUE) {
      t <- try(renv::restore(prompt = FALSE, clean = FALSE))
      if (!inherits(t, "try-error")) {
        break  # restore succeeded
      }
      # Pull the failing package name out of the error message
      txt <- as.character(t)
      r <- gregexec("install of package '(.*?)' failed", txt, perl = TRUE)
      if (any(r[[1]] == -1)) {
        break  # not a package install failure; give up
      }
      pkg <- regmatches(txt, r)[[1]][2, 1]
      # Update the offending package and record it in the lockfile
      renv::update(packages = pkg, prompt = FALSE)
      renv::record(pkg)
    }
Echoing that this has been a problem with prior R updates for me as well; I've run into exactly the same challenges as @torbjorn describes with the change to 4.4.
I might be misunderstanding something, but -- why do you want to update R without also updating the packages used by your project? Quite often, older versions of R packages will fail to build / compile against newer versions of R, or other runtime incompatibilities can arise when using older versions of R packages with newer versions of R. These are, more or less, outside of renv's control.

Since CRAN uses a rolling release model, your best shot at a good / compatible state is to take the latest-available packages at some particular point in time. Occasionally you may need to break these rules, but I think that's the baseline that should normally be used. Mixing different versions of packages from different release times without careful thought is dangerous.
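As a rough sketch of that workflow (not something prescribed in this thread), the post-R-upgrade step could refresh the whole project to current CRAN in one pass, using the existing renv functions renv::update() and renv::snapshot():

    # Minimal sketch, assuming the project already uses renv and the new
    # R version is active: move all project packages to the latest
    # available builds, then rewrite the lockfile to match the library.
    renv::update(prompt = FALSE)    # upgrade all project packages
    renv::snapshot(prompt = FALSE)  # record the new versions in renv.lock

The project is then tested against the refreshed lockfile, rather than trying to keep the old package versions alive on the new R.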
Not speaking for anyone else... but in our case we're really only looking to minimise time and risk during upgrades - but maybe we've been kidding ourselves!

Longer answer: We've been trying to have one main version of R in use across all our apps, reports and packages, and we've been quite forward about upgrading R... this is because (so far) R itself hasn't given our apps too many breaking changes - e.g. the only thing I can remember causing pain were a few cases caused by the

However, we've been less keen on upgrading packages, as there are many, many more of them and upgrading them has given us quite a few problems which have taken time to find and fix. These problems have especially been seen around changes of API surface - e.g. in tidyverse packages as they moved to 1.0 status. When the API surface has changed we first had to identify the change (in practice by running tests, scripts and apps and hoping that the change caused an error rather than something more subtle); then had to find all the places in our scripts and packages that were affected; then had to manually go through and change and retest all the affected bits of our R code. This has taken quite a long time to do... and it has risked us missing changes, which would then cause failures at some later date when some specific report or app code runs... so that's why we've tried to avoid mass package upgrades (so far).
It probably is correct that the 'right' way to do an R upgrade is to wipe the current renv.lock file, hydrate a new set of packages, and then do a heck of a lot of testing... but to date we've mainly been "winging it" by upgrading R and installing what packages we can from

Maybe the underlying story here is that:
I'm trying in my mind to also compare this to how we think about other projects - e.g. our C# projects when we upgrade .NET versions. However, I think those compiled languages have a slight advantage here, as most of the API-level changes will throw up compile-time errors... Lots to consider here... sorry this is a bit rambling...
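For reference, the "wipe and rebuild" approach mentioned above could look roughly like the following. This is only a sketch, assuming you are happy to lose the old pinned versions and to lean on your test suite afterwards:

    # Rebuild the lockfile from scratch after an R upgrade, rather than
    # restoring the old pinned versions.
    unlink("renv.lock")              # discard the old lockfile
    renv::hydrate()                  # discover the project's dependencies
                                     # and install them into the project library
    renv::snapshot(prompt = FALSE)   # write a fresh renv.lock
    # ...then run the project's tests / reports to catch API changes.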
@slodge beat me to this, but we have the exact same experience.
For deployment and CI/CD pipelines we typically run R in containers, so it's less of a problem there. The problem is more that hands-on work and analysis in projects is typically done using locally installed RStudio and R (installed on the workstation being used). For the time being we are happy (at least after we scripted the approach) try-catching renv::restore and updating one package at a time, thus minimizing the number of packages that get updated.
After updating to R 4.4 I just spent 20 minutes repeating the following procedure to restore an old project: run renv::restore(), note which package failed to install, renv::update() that one package, record it in the lockfile, and run renv::restore() again.
Could we have restore(and_try_updating_to_make_it_work = TRUE) handle this itself? (With a better argument name, obviously.) There is no need for a human to be at the helm to repeat the above routine for a period of time.

Of course there are perfectly valid reasons why more or less blindly updating arbitrary packages isn't a good idea, but I imagine that, for probably a large majority of projects, updating package versions that are no longer installable can safely be the preferred solution, rather than troubleshooting the installation of old versions.
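Until renv offers such an option, the requested behaviour can be approximated in user code. This is only a sketch of the idea: the function name, the retries argument, and the reliance on renv's current error-message wording are all assumptions, not renv API.

    # Hypothetical helper: restore, and fall back to updating whichever
    # package fails to install, up to a fixed number of attempts.
    restore_or_update <- function(retries = 20) {
      for (i in seq_len(retries)) {
        res <- try(renv::restore(prompt = FALSE, clean = FALSE), silent = TRUE)
        if (!inherits(res, "try-error"))
          return(invisible(TRUE))                  # restore succeeded
        txt <- as.character(res)
        m <- regmatches(txt, regexec("install of package '(.*?)' failed", txt))[[1]]
        if (length(m) < 2)
          stop(attr(res, "condition"))             # not an install failure; rethrow
        pkg <- m[2]
        message("Updating '", pkg, "' since the locked version won't install")
        renv::update(packages = pkg, prompt = FALSE)
        renv::record(pkg)
      }
      stop("renv::restore() still failing after ", retries, " attempts")
    }

Something along these lines is essentially what the scripted workaround earlier in the thread does; the ask is for renv to offer it behind an explicit opt-in argument.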
Halfway through I tried renv::update() to just update all packages (risking rebuilding packages not in the cache), but this didn't fix things; I still had to go back, update individual packages and repeat. It is also not an ideal solution, as you now update everything and not just the packages that are causing a problem.