I have created a custom cache for my pipeline, but it exceeds the memory limit by 0.34 GB (the total cache was 1.34 GB), which is not allowed. The cache limit here is 1 GB; if the cache exceeds that size, the pipeline skips it and does not create any cache.
image: python:3.6.2
pipelines:
  default:
    - step:
        caches:
          - condacache
        script:
          - /opt/python/bin/conda update -y conda
          - /opt/python/bin/pip install --upgrade pip
          - /opt/python/bin/pip install -r requirements.txt
          - /opt/python/bin/pytest
definitions:
  caches:
    condacache: /opt/python
Here is the script from my pipeline. Please take a look and suggest how I can reduce my cache size, and whether I can exclude anything from the cache.
Hi Rajat,
Can you try replacing your caches section with the following?
caches:
  - pip
This uses the predefined pip cache, which caches the default directory for pip modules, as noted here: https://confluence.atlassian.com/bitbucket/caching-dependencies-895552876.html
By caching /opt/python, you're also caching all the binary files, which contributes to your large cache size and doesn't need to be cached since it's already included in the python image.
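To make the change concrete, here is a sketch of how your full bitbucket-pipelines.yml could look with the predefined pip cache. The pip cache is built into Bitbucket Pipelines, so no custom entry under definitions is needed; the condacache definition and the /opt/python path are dropped entirely:

```yaml
image: python:3.6.2
pipelines:
  default:
    - step:
        caches:
          - pip  # predefined cache; covers pip's default cache directory
        script:
          - /opt/python/bin/conda update -y conda
          - /opt/python/bin/pip install --upgrade pip
          - /opt/python/bin/pip install -r requirements.txt
          - /opt/python/bin/pytest
```

Note that this only caches downloaded pip packages, not the conda installation itself, so the conda update will still run from scratch each build.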
Let me know if that works.
Thanks,
Phil