When running a load test for Bitbucket with very heavy load, the caches in the local home directory fill up the disk.
This happens even when running with the default cache eviction settings (more on this in the bitbucket.properties - SCM cache article).
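For reference, the eviction behaviour is controlled through bitbucket.properties. The snippet below is a minimal sketch of the two most relevant settings; the values shown are illustrative defaults, so confirm them against the documentation for your Bitbucket version:

    # Minimum free disk space (in bytes) required before a new cache is created
    plugin.bitbucket-scm-cache.minimum.free.space=5368709120
    # How often (in seconds) Bitbucket checks for expired caches to evict
    plugin.bitbucket-scm-cache.expiry.check.interval=300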
Is this expected?
Under specific conditions (including very high load testing), there are two main factors that can lead to this behaviour.
Disk space is checked only at cache creation
Bitbucket checks for available disk space when it starts to create a cache, not while populating it. This means that if enough space is available when a repository is cloned for the first time (which triggers the creation of its cache), the streaming of data into the cache will start.
After this, there is no additional check that the remaining disk space is still above the threshold while the data is being written.
When a new repository is cloned, the check is performed again before the writing process starts.
If a significant number of clone requests are performed at the same time (note that for this to happen, the timeframe would need to be very short), there may be enough space before the caches are written, but the disk fills up very quickly due to the concurrent clone requests. The sketch below illustrates this race.
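The following is a minimal Python sketch of that race (a simplified model with a hypothetical 5 GiB threshold, not Bitbucket's actual implementation): every concurrent clone passes the one-time free-space check before any of them has written data, so together they overshoot the threshold.

    GIB = 1024**3
    MIN_FREE_SPACE = 5 * GIB   # hypothetical eviction threshold
    free_space = 6 * GIB       # simulated free space on the cache disk

    def passes_creation_check():
        # The check runs once, when the cache is created,
        # not while the data is being streamed into it.
        return free_space >= MIN_FREE_SPACE

    # Five ~2 GiB clones arrive within a very short window: every one of
    # them passes the check before any data has been written to disk.
    clone_sizes = [2 * GIB] * 5
    admitted = [size for size in clone_sizes if passes_creation_check()]

    for size in admitted:
        free_space -= size  # streaming proceeds with no further threshold check

    print(f"Admitted clones: {len(admitted)}, free space left: {free_space / GIB:.0f} GiB")
    # -> Admitted clones: 5, free space left: -4 GiB (the disk fills up)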
Caches are not cleared if there is a process reading from them
If a process is reading from a cache (e.g. a clone in progress), the cache will not be cleared immediately; Bitbucket waits until the read is complete before triggering the cleanup.
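Conceptually, this behaves like the reference-counted sketch below (a simplified illustration, not Bitbucket's actual code): evicting an entry with active readers is deferred, and the last reader to finish triggers the cleanup.

    class CacheEntry:
        """A cached repository pack on disk."""
        def __init__(self, name, size):
            self.name = name
            self.size = size
            self.readers = 0            # clones currently streaming from this entry
            self.pending_eviction = False

    def delete_from_disk(entry):
        print(f"evicted {entry.name}, freed {entry.size} bytes")

    def evict(entry):
        if entry.readers > 0:
            entry.pending_eviction = True   # a clone is still reading: defer cleanup
        else:
            delete_from_disk(entry)

    def finish_read(entry):
        entry.readers -= 1
        if entry.readers == 0 and entry.pending_eviction:
            delete_from_disk(entry)         # last reader done: run the deferred cleanup

    entry = CacheEntry("project/repo.git", 2 * 1024**3)
    entry.readers = 1      # a clone is in progress
    evict(entry)           # nothing is deleted yet; eviction is deferred
    finish_read(entry)     # the clone completes, and only now is the cache removed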
To recap, what happens during the high load test is a combination of these two factors: the free-space check is performed only when a cache is created, and caches that are still being read cannot be evicted, so a burst of concurrent clones fills the disk faster than eviction can reclaim space.
As a suggestion, always make sure the load test is performed using a simulated load close to the one seen in production.