Hi,

I have a very large repo where the unshallow fetch takes 1-2 minutes:
```
Wed, 09 Aug 2023 16:31:34 GMT Run tony84727/[email protected]
Wed, 09 Aug 2023 16:31:34 GMT /usr/bin/git fetch --prune --unshallow
Wed, 09 Aug 2023 16:32:33 GMT From https://github.com/<org>/<repo>
```
I was wondering if there might be a better way to do this check? Maybe checking out the head and base refs and doing the comparison that way?
I'm using this action in many of my workflows to conditionally skip long-running jobs depending on whether relevant files changed, so the time spent is still worth it, but it would be nice if it were faster.
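For what it's worth, here is a rough sketch of what that head/base comparison could look like in a `pull_request` job. `GITHUB_BASE_REF` and `GITHUB_SHA` are the standard Actions context values, but whether a depth of 1 is enough for the comparison the action actually needs is an assumption on my part:

```sh
# Sketch only: fetch just the base branch tip and the PR commit, shallowly,
# instead of unshallowing the whole history.
git fetch --no-tags --depth=1 origin "$GITHUB_BASE_REF" "$GITHUB_SHA"

# Tree-to-tree diff between the two fetched commits. This avoids downloading
# the full history, but it is not a merge-base (three-dot) comparison, so it
# could over-report files if the base branch has moved ahead.
git diff --name-only "origin/$GITHUB_BASE_REF" "$GITHUB_SHA"
```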
I occasionally get extremely slow performance on my repo: around 3 minutes, and sometimes even up to 10 minutes. Most of the time, though, it runs in about 40 seconds.
I just noticed you merged a PR that attempts to fix this. I think our repo's problem is the sheer number of tags and branches it has; the change you made to add --filter=blob:none doesn't seem to have improved the performance for us.
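If the bottleneck really is the ref count rather than blob transfer, it might be worth confirming that before the next attempt. A quick way to count what the server advertises, plus a hypothetical variant of the fetch that also skips tag refs (not something the action necessarily exposes today), would be:

```sh
# Count how many refs the server advertises; thousands of tags and branches
# can slow down fetch negotiation even when blobs are filtered out.
git ls-remote --heads origin | wc -l
git ls-remote --tags origin | wc -l

# Hypothetical variant: keep the blob filter but skip tag refs entirely.
git fetch --prune --no-tags --filter=blob:none --unshallow
```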