Fixed: limit on number of packages + add rworkflows #67
bschilder wants to merge 3 commits into r-hub:master from rworkflows
Conversation
Site is back up now.
@gaborcsardi have you (or any other maintainers of […])

Thanks,
Thanks for the PR, but I am not going to merge this, sorry. I prefer using our workflows instead of rworkflows. As for the parallelization, I prefer not to use parallel for HTTP.
Ok, I suppose that's up to you, but would you mind providing some justification for the benefit of users who requested some of these features? e.g. #56
It seems your workflows have been failing for several years now, unless I'm missing something. If you were to fix this, I think it would be a decent alternative.
Is there a technical reason for this? I'd be curious to know myself! Similarly, is there an alternative implementation that would suit this purpose better? If you do not wish to support parallelization at all, you have some other options available to you (in order of preference):
Happy to adjust the PR as needed, just let me know what you'd like to accomplish with this package.
> I prefer using our workflows instead of rworkflows.
Yes, several. Using multiple processes for concurrent HTTP is an anti-pattern. However, in general, I am not sure if I want to provide a quick way for people to "rank" all CRAN packages according to downloads. I don't really want people to think that this is an indication of package quality.
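For context on the anti-pattern point: concurrent HTTP can be done in a single R process with the `curl` package's multi interface, rather than spawning workers with `parallel`. A rough sketch (the cranlogs endpoint URL is assumed from the public API; not necessarily how this package would implement it):

```r
library(curl)

pkgs <- c("ggplot2", "dplyr", "data.table")
urls <- sprintf("https://cranlogs.r-pkg.org/downloads/total/last-month/%s", pkgs)

out  <- new.env()   # collect responses keyed by URL
pool <- new_pool()
for (u in urls) {
  curl_fetch_multi(
    u,
    done = function(res) out[[res$url]] <- rawToChar(res$content),
    pool = pool
  )
}
multi_run(pool = pool)  # all requests are multiplexed on one event loop
```

One process, one event loop: no worker startup cost, no serialization of results between processes.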
Haven't come across this one! I'll check it out though. As you may already be aware, there's also […]. Even so, if you're no longer using Travis, perhaps it's time to retire it from here?
This is great information to have! Thanks for the explanation.
Hm, I hadn't thought of it that way. I personally don't agree with this reasoning as I think it's best to give the users the complete data and make decisions about how to responsibly use it themselves (perhaps with a disclaimer in the documentation). For example, in my case I'm using the downloads to assess package usage, but I wouldn't think to use this as a metric of quality (not sure how many other users would?). Would you mind updating this thread here to let the other users know of your decision either way? Thanks! #56


- Limit on number of packages as an argument to `cran_downloads`, and prevent it from failing with many packages: #56
- `cran_downloads` with parallelisation.
- `cran_downloads` with lots of packages.
- Removed `dontrun{}` from `cran_downloads` and `cran_top_downloads` examples. Unclear why this was here?
- Helper functions: `message_parallel`, `split_batches`
- Added `rworkflows`. All you need to do on your end @gaborcsardi is add a GH token as a GH secret named "PAT_GITHUB" to the `cranlogs` repo.

Best,
Brian
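The batching idea behind preventing `cran_downloads` from failing with many packages can be sketched as follows. This `split_batches` is a from-scratch illustration under assumed semantics (chunk a package vector so each request stays small), not necessarily the PR's implementation:

```r
# Split a vector into batches of at most `batch_size` elements,
# so each API call only carries a manageable number of packages.
split_batches <- function(x, batch_size = 100) {
  split(x, ceiling(seq_along(x) / batch_size))
}

pkgs    <- paste0("pkg", seq_len(250))
batches <- split_batches(pkgs)
lengths(batches)  # 100 100 50
```

Each batch can then be queried separately and the results row-bound together afterwards.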