I recently rewrote and force-pushed the history of my Bitbucket repository to remove large binary blobs (BFG + git gc), but the “Repository size” shown in the Bitbucket UI still reports the old (large) value. A fresh clone from Bitbucket confirms that all big files are gone, so Bitbucket’s server-side garbage-collection simply hasn’t yet reclaimed the storage.
Can you please trigger a manual GC so that the displayed size matches the actual cleaned repository? Thank you.
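For anyone following the same route, the client-side cleanup described above can be sketched end to end. This is a self-contained demo on a throwaway temp repository; it substitutes stock git's `filter-branch` for BFG so it runs without a separate install, and the 1 MiB `big.bin` file and all paths are illustrative stand-ins, not names from the original question:

```shell
set -e
# Throwaway demo repo standing in for the real one; all names are examples.
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.email demo@example.com
git config user.name  demo

# Simulate history containing a large binary blob plus normal files.
head -c 1048576 /dev/zero > big.bin          # 1 MiB stand-in for a binary blob
git add big.bin && git commit -q -m "add big blob"
echo 'hello' > app.txt
git add app.txt && git commit -q -m "add code"

# Rewrite every commit to delete big.bin from history
# (BFG does the same job much faster on large real repositories).
FILTER_BRANCH_SQUELCH_WARNING=1 git filter-branch -f --index-filter \
  'git rm --cached --ignore-unmatch -q big.bin' -- --all

# Drop the refs/original/ backups filter-branch leaves behind,
# expire reflogs, and garbage-collect so the blob is really deleted.
git for-each-ref --format='%(refname)' refs/original/ |
  while read -r ref; do git update-ref -d "$ref"; done
git reflog expire --expire=now --all
git gc --prune=now --quiet

# On the real repository you would now force-push the rewritten history:
#   git push --force --all && git push --force --tags
```

After this, the local object store is clean, but Bitbucket's copy still holds the old objects until its own server-side GC runs — which is exactly what the question is asking Atlassian to trigger.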
Hi @HB,
Welcome to the Bitbucket Cloud community!
I've run garbage collection on the repository in your workspace, and the repository size has been significantly reduced. This is also reflected in the overall workspace size.
Before proceeding with the next push to the remote repository, please ensure that you and all repository collaborators make a fresh clone of the remote repository to avoid reintroducing old/cleaned-up history and potentially inflating the repository size again.
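The fresh-clone advice above can be made concrete. In this sketch a throwaway local bare repository stands in for the Bitbucket remote (in real use you would clone your repository's actual URL), and `git count-objects -vH` is just a convenient way to confirm the fresh clone's on-disk size:

```shell
set -e
# A throwaway bare repo stands in for the Bitbucket remote.
remote=$(mktemp -d)/origin.git
git init -q --bare "$remote"

# Seed the "remote" with one commit representing the cleaned history.
seed=$(mktemp -d)/seed
git init -q "$seed"
cd "$seed"
git config user.email demo@example.com
git config user.name  demo
git commit -q --allow-empty -m "cleaned history"
git push -q "$remote" HEAD:refs/heads/main
git -C "$remote" symbolic-ref HEAD refs/heads/main

# The actual advice: clone fresh instead of pulling into an old checkout,
# so stale local refs can't reintroduce the pre-cleanup objects.
cd "$(mktemp -d)"
git clone -q "$remote" fresh-clone
cd fresh-clone
git count-objects -vH    # on-disk size of the fresh clone
```

The reason a fresh clone matters: an old working copy still holds the pre-rewrite commits in its local refs and reflogs, and a push or merge from it would reupload those objects and inflate the repository again.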
- Phil
Welcome to the community,
Garbage collection (gc) runs automatically with each push, but it may not always clean up your repository immediately. Someone from Atlassian will have to trigger it manually when they see your question, or you can raise a ticket with Atlassian Support if you are on a paid plan.
BR
Kai
Hi Kai, thank you for your answer.
I know that repository clean-up does not run immediately. I understand that it might take 24 hours, but it's been 48 hours and the repository size shown in Bitbucket still reports the old value, very close to the size limit. I hope someone from Atlassian runs the garbage collection.