To test my repo, I need to run:
Hi Pasquale and welcome to the community!
Docker Compose is available in the image atlassian/default-image:4 as docker-compose. Please keep in mind that Docker Compose v2 (which is preinstalled in atlassian/default-image:4) won't work with BuildKit. Docker Compose with the BuildKit builder, which it uses by default, requires privileged mode, and that is disabled in Bitbucket Pipelines for security reasons. If you want to use Docker Compose v2, you will need to disable BuildKit by adding the command export DOCKER_BUILDKIT=0 in your yml file, in the Pipelines step that uses docker-compose. Restricted Docker commands and options are listed on this page:
Installing Docker Engine during your build won't work. Every Pipelines step runs in a Docker container, and unless you use the docker service that Pipelines provides, your build won't have access to a Docker daemon.
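For example, a step along these lines, with the docker service enabled (the docker-compose command is only a placeholder for whatever your tests actually run):
```
pipelines:
  default:
    - step:
        image: atlassian/default-image:4
        services:
          - docker
        script:
          # Disable BuildKit so that Docker Compose v2 does not need privileged mode
          - export DOCKER_BUILDKIT=0
          # Placeholder: replace with the docker-compose commands your tests need
          - docker-compose up --build -d
```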
You can build your own Docker image that includes both python:3.11 and docker-compose, and use that as a build container in your Pipelines builds. You can install Docker Compose standalone (in images that don't include it) by following the instructions on this page:
You can use python:3.11 as the base image, or you could use a different one, e.g. alpine, ubuntu or debian, and then install Python 3.11, docker-compose and any other tools you need by adding the necessary commands to the Dockerfile. You can then build and push this image to a Docker registry and reference it in your yml file. We support public and private Docker images, including those hosted on Docker Hub, AWS, GCP, Azure and self-hosted registries accessible on the internet:
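Purely as an illustration, a minimal Dockerfile sketch along those lines (the Compose release version and download URL follow Docker's standalone install instructions and are assumptions; pin the version you actually need):
```
FROM python:3.11

# Install the standalone docker-compose binary (example version; adjust as needed)
RUN curl -SL https://github.com/docker/compose/releases/download/v2.24.5/docker-compose-linux-x86_64 \
      -o /usr/local/bin/docker-compose \
    && chmod +x /usr/local/bin/docker-compose
```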
Finally, if you want to run docker commands that are restricted in Pipelines, you can use a self-hosted runner instead for a specific Pipelines step:
If you use a Linux Docker runner, you will need to define a custom dind image for the Docker service, so that you won't run into the same restrictions as cloud-based builds:
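Something along these lines, assuming the default self.hosted and linux runner labels (the script command is again a placeholder):
```
definitions:
  services:
    docker:
      image: docker:dind

pipelines:
  default:
    - step:
        runs-on:
          - self.hosted
          - linux
        services:
          - docker
        script:
          - docker-compose up --build -d
```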
Please feel free to let me know if you have any questions.
Kind regards,
Theodora
You are very welcome. Please feel free to reach out if you have any questions!
As an update, the custom docker image I created and hosted on the hub works.
I have more control over the process. However, Bitbucket does not cache it and re-builds the image every time for no apparent reason.
It's not really effective to spend 3 minutes building for 10 seconds of tests.
Hi Pasquale,
I'm not really sure which image you are referring to now and I'm a bit confused by mentions of "building" and "re-builds".
In our earlier discussion we were talking about the Docker image that is used as a build container for a certain Pipelines step, and the option of building and using a custom Docker image. If you want to use such a custom Docker image as a build container, there is no need to build it every time in Pipelines, unless its dependencies change very often. You can build it once locally, push it to a Docker registry and then reference it in your bitbucket-pipelines.yml.
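For example, roughly the following, run once on your own machine (the image name and tag are placeholders):
```
# Build the custom image from its Dockerfile and push it to a registry once;
# after that, Pipelines only needs to pull it.
docker build -t your-dockerhub-user/your-test-image:latest .
docker push your-dockerhub-user/your-test-image:latest
```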
For images used as a build container or as service containers, we automatically cache all public Docker Hub images used in Pipelines.
Are you talking about the build container's image, or about a different image you build with docker compose build during the step?
Kind regards,
Theodora
Thank you for replying.
My bad. What I mean is not "build" but rather "clone".
Bitbucket clones the docker image every time at the beginning of the pipeline.
The second issue is that, despite having defined a cache with no automatic invalidation:
Hi Pasquale,
Since you're talking about the beginning of the pipeline, I assume you are referring to the Docker image that is used as a build container for a Pipelines step. Are you hosting the image you use as a build container on Docker Hub or a different registry? And is it a public or a private image? If it is a public image from Docker Hub, we cache these so the image will be pulled from our cache. If it is a private image or hosted on a different registry, we do not support caching for these images when they are used as build containers.
Regarding the second issue, there are a few things to check:
Kind regards,
Theodora
Hi. Thank you and sorry for the late reply.
My image is public, but Bitbucket does not cache it.
To answer the second part:
1. The dependency caches are empty. Nothing listed.
2. Indeed I've found the message:
Cache "pythonvenv: ~/venv": Skipping upload for empty cache
I've found that the directory path was wrong. This solves issue 2. Thank you for pointing me in the right direction!
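In case it helps anyone else, the cache path has to match the directory where the venv is actually created; a minimal sketch, assuming the venv is created as ./venv in the clone directory:
```
definitions:
  caches:
    pythonvenv: venv          # path relative to the clone directory

pipelines:
  default:
    - step:
        caches:
          - pythonvenv
        script:
          - python -m venv venv
          - venv/bin/pip install -r requirements.txt
```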
Still looking for a solution for problem 1, i.e. caching the docker image. Its size is about 500 MB.
It is public on Docker Hub. Refer to decadenza/exicentrum-testenv.
This is what I get in the Build teardown:
The docker cache will consist of all unique layers from the following images:
Usage: grep [OPTION]... PATTERNS [FILE]...
Try 'grep --help' for more information.
"docker image save" requires at least 1 argument.
See 'docker image save --help'.
Usage: docker image save [OPTIONS] IMAGE [IMAGE...]
Save one or more images to a tar archive (streamed to STDOUT by default)
chmod: cannot access '/opt/atlassian/pipelines/agent/cache/docker/docker.tar': No such file or directory
Docker images saved to cache
Cache "docker: docker.tar": Skipping upload for empty cache
Searching for test report files in directories named [test-results, failsafe-reports, test-reports, TestResults, surefire-reports] down to a depth of 4
Finished scanning for test reports. Found 0 test report files.
Merged test suites, total number tests is 0, with 0 failures and 0 errors.
Hi Pasquale,
It's good to hear that issue 2 is resolved!
About issue 1, you posted info from the Build teardown. The docker cache mentioned in the Build teardown is for caching images that you use during a Pipelines build. It has nothing to do with the caching of an image that is used as a build container in a Pipelines step.
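For reference, the docker cache from the Build teardown would only apply to something like the following, where an image is pulled or built by a docker command during the step's script (the commands are illustrative):
```
pipelines:
  default:
    - step:
        services:
          - docker
        caches:
          - docker   # caches layers of images pulled or built during the script
        script:
          - docker pull decadenza/exicentrum-testenv
```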
So now I have to go back to my previous question and ask again: where exactly do you use the Docker Hub image decadenza/exicentrum-testenv that you expect to be cached? Which of the following two is the case?
(1) Do you use it as a build container for a Pipelines step, similarly to one of the two following definitions?
image: decadenza/exicentrum-testenv

pipelines:
  default:
    - step:
        script:
          - echo "Hello, World!"

or:

pipelines:
  default:
    - step:
        image: decadenza/exicentrum-testenv
        script:
          - echo "Hello, World!"
(2) Or do you use this image during your Pipelines build in a docker command, e.g. something like the following (or with a different docker command)?
pipelines:
  default:
    - step:
        script:
          - docker pull decadenza/exicentrum-testenv
Kind regards,
Theodora
Hello @Pasquale Lafiosca
From my experience, using Docker Compose and Python 3.11 in Bitbucket Pipelines requires careful image selection and setup. The recommended atlassian/default-image:4 works, but it’s inefficient to manually install Python.
Consider creating a custom Docker image that includes both Python 3.11 and Docker Compose pre-installed to streamline your pipeline.
For the error with tee, ensure that all required dependencies are installed in your pipeline’s environment. If the issue persists, consult Docker documentation or contact Atlassian support at https://support.atlassian.com/contact/#/ for guidance on optimizing your setup.