Hello, our cloud repo ( https://bitbucket.org/warrantygroup/sfdc-service-cloud/ ) shows that we are at 1.8 GB, but locally, when I run the command
git count-objects -v
the local size is size-pack: 422917 (git reports this in KiB, so roughly 413 MiB).
Also, a couple of hours ago we were at 450 MB. How can it grow this fast? Can you please run the same command that was run in this question: https://community.atlassian.com/t5/Bitbucket-questions/Why-Bitbucket-cloud-is-showing-a-different-size-than-my-local/qaq-p/716727#M23204
We need this ASAP!
Also, I would like someone to explain why this is happening.
Thank you.
Regards,
Fricco.
Hi Fricco, I've checked your repo: the remote size was 723.5 MB. I ran a git gc on it and the size decreased to 379.2 MB.
This is how it happens: suppose you create a branch, change a bunch of files, commit your changes, do that again, and then delete the branch without merging it. All of those changed files and commits are sitting around taking up space, but you’re probably never going to use them again.
git gc runs a number of housekeeping tasks within the current repository, such as compressing file revisions (to reduce disk space and improve performance) and removing unreachable or unnecessary objects. There are subtleties, but that's how it works overall.
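To make the branch-delete scenario above concrete, here is a minimal sketch in a throwaway repository (all paths, file names, and identities below are placeholders, not anything from this thread): commit a large file on a branch, delete the branch unmerged, and watch `git gc` reclaim the space.

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email dev@example.com   # placeholder identity
git config user.name "Example Dev"

git commit -q --allow-empty -m "initial commit"
base=$(git symbolic-ref --short HEAD)

# Change files on a branch, commit, then delete the branch unmerged.
git checkout -q -b scratch
dd if=/dev/zero of=big.bin bs=1024 count=1024 2>/dev/null
git add big.bin
git commit -q -m "add a 1 MiB file"
git checkout -q "$base"
git branch -q -D scratch

# The 1 MiB blob is now unreachable from any branch, but still on disk.
git count-objects -v

# Expire the reflog entries that still point at it, then gc.
# A plain `git gc` keeps unreachable objects until they expire
# (about two weeks by default); --prune=now drops them immediately.
git reflog expire --expire-unreachable=now --all
git gc -q --prune=now
git count-objects -v
```

Note that a default `git gc` is deliberately conservative: without the reflog expiry and `--prune=now`, recently unreachable objects are kept for a grace period, which is why a repo can stay large for a while even after the offending branch is gone.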
Hope that helps!
Ana
Thanks @Ana Retamal, you rock!
The explanation makes sense. I don't know why it was spiking up to 1.8 GB and had already dropped to 723 MB by the time you reviewed it, but I'm glad you still ran it and reduced it even more.
How often does this housekeeping process run? We have several teams working in the same repo across different time zones, so it's quite possible that some of our 50+ developers are doing exactly what you described, but it sounds like this can basically screw everybody else until the housekeeping scripts run. Is it possible to have it run every day or so? Would there be any impact if that were done?
Also, can the 2GB limitation be increased?
git gc runs automatically when certain triggers are met. It can sometimes take a while for those triggers to fire, and in those cases you can contact us to run it manually for you.
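For local clones, those triggers are configurable (a client-side sketch only; Bitbucket's server-side housekeeping schedule is internal to Atlassian, and the values below are git's documented defaults):

```shell
# `git gc --auto` is the "trigger" form of gc: it does real work only
# when a threshold is exceeded, and otherwise exits immediately, so
# running it daily in a clone is essentially free.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q

# Defaults: gc --auto fires at ~6700 loose objects, or ~50 pack files.
git config gc.auto 6700
git config gc.autoPackLimit 50

# How long unreachable objects are kept before pruning (default 2 weeks).
git config gc.pruneExpire "2.weeks.ago"

git gc --auto   # well below every threshold here, so this is a no-op
```
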
As for the 2 GB limitation, it cannot be increased. If you're storing large files or binary files, we'd recommend enabling Git LFS (Large File Storage). You can learn more about it at Git large file storage in Bitbucket.
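For reference, a hedged sketch of what enabling LFS looks like (the file patterns below are examples, not anything from this repo): after a one-time `git lfs install` on each machine, every `git lfs track "<pattern>"` writes a line like these into .gitattributes, which you then commit like any other file:

```
# .gitattributes entries created by e.g. `git lfs track "*.psd"`
*.psd filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
```

From then on, matching files are stored in git history as small pointer files, with the actual content in LFS storage, which keeps the repository itself under the 2 GB limit.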
For more info you can also check Bitbucket pricing.
Let us know if you have any other questions!
Kind regards,
Ana