Bitbucket Dependency Scanner Fails with "Container 'docker' Exceeded Memory Limit"

Zahidur Rahman
I'm New Here
April 14, 2025

I'm using the Bitbucket dependency scanner to check for vulnerabilities, but I frequently encounter a Docker memory limit error. I've tried increasing the memory using size: 2x and size: 4x, both within the pipe and at the step level, but the issue persists. Strangely, the pipeline does succeed occasionally.

I also tried with and without a suppressions file, but got the same error.

The error message I receive is:
msg="error waiting for container: unexpected EOF"
It appears to be related to Docker exceeding the memory limit.

Any suggestions or help would be greatly appreciated.


image: php:7.2-fpm
pipelines:
  default:
     - step:
            name: 'Dependency scanner'
            script:
               - pipe: atlassian/bitbucket-dependency-scanner:0.7.0
            size: 2x
            variables:
                 NVD_API_KEY: "$NVD_API_KEY"
                 UPDATE_NVD_CACHE: 'true'
                 DEBUG: "false"
                 EXTRA_ARGS:
                      - "--format=HTML"
                      - "--failOnCVSS=7"
                      - "--nodeAuditSkipDevDependencies"
                      - "--nodePackageSkipDevDependencies"
                      - "--suppression=published-suppressions.xml"
            artifacts:
               - dependency-check-report.*

2 answers

1 accepted

1 vote
Answer accepted
Theodora Boudale
Atlassian Team
April 16, 2025

Hi Zahidur and welcome to the community!

A pipe starts a new Docker container, so a Pipelines step that runs a pipe uses a Docker service for this purpose. Service containers get 1024 MB of memory by default, regardless of the step size. However, it is possible to allocate more memory to a service.

Since there are no other services on this step, you can allocate up to 3072 MB to the Docker service for a 1x size step and up to 7128 MB for a 2x size step.

A sample yml file with a 1x size step is the following:

definitions:
  services:
    docker:
      memory: 3072

image: php:7.2-fpm
pipelines:
  default:
    - step:
        name: 'Dependency scanner'
        services:
          - docker
        script:
          - pipe: atlassian/bitbucket-dependency-scanner:0.7.0
            variables:
              NVD_API_KEY: "$NVD_API_KEY"
              UPDATE_NVD_CACHE: 'true'
              DEBUG: "false"
              EXTRA_ARGS:
                - "--format=HTML"
                - "--failOnCVSS=7"
                - "--nodeAuditSkipDevDependencies"
                - "--nodePackageSkipDevDependencies"
                - "--suppression=published-suppressions.xml"
        artifacts:
          - dependency-check-report.*

If this is not enough, you can use a 2x step and increase the docker service memory further, up to 7128 MB. The maximum memory you can allocate to the services of a step is equal to the total memory of the step minus 1024 MB of memory (that is required for the build container).

You can see the memory allocated to a step depending on its size in the Bitbucket Pipelines documentation (the memory is in GB).

The size option belongs to the step, not to the pipe, so if you use e.g. 2x, you need to add size: 2x at the same level as the step's name, services, script, and artifacts, not before the pipe variables.
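
For illustration, here is a minimal sketch of how the 2x variant could look, with the size option at the step level and the Docker service memory raised accordingly (the pipe variables are trimmed for brevity):

definitions:
  services:
    docker:
      memory: 7128   # up to the step's total memory minus the 1024 MB reserved for the build container

image: php:7.2-fpm
pipelines:
  default:
    - step:
        name: 'Dependency scanner'
        size: 2x            # step option, same level as name, services, script, and artifacts
        services:
          - docker
        script:
          - pipe: atlassian/bitbucket-dependency-scanner:0.7.0
            variables:
              NVD_API_KEY: "$NVD_API_KEY"
        artifacts:
          - dependency-check-report.*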

Please keep in mind that 2x steps use twice the build minutes of 1x steps. This is also mentioned in the documentation.

If you happen to have other steps that use a Docker service and you don't want to allocate that much memory to them, you can configure multiple Docker services with different memory limits:
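
As a rough sketch of what that could look like (docker-small is just an example name here; as far as I know, type: docker is how an additional Docker service is declared, but please double-check against the current Pipelines documentation):

definitions:
  services:
    docker:            # larger service, used by the dependency scanner step
      memory: 7128
    docker-small:      # example second Docker service for lighter steps
      type: docker
      memory: 1024

A step would then list docker-small under its services instead of docker to get the smaller limit.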

Please feel free to let me know how it goes and if you have any questions!

Kind regards,
Theodora

Zahidur Rahman
I'm New Here
April 16, 2025

Thanks, it's working now after docker was added as a service definition.

image: php:7.2-fpm
pipelines:
  default:
    - step:
        name: "Dependency scanner"
        services:
          - docker
        script:
          - pipe: atlassian/bitbucket-dependency-scanner:0.7.0
            variables:
              UPDATE_NVD_CACHE: "true"
              NVD_API_KEY: $NVD_API_KEY
              EXTRA_ARGS:
                - "--format=HTML"
                - "--failOnCVSS=7"
                - "--nodeAuditSkipDevDependencies"
                - "--nodePackageSkipDevDependencies"
                - "--suppression=published-suppressions.xml"
        artifacts:
          - dependency-check-report.*
definitions:
  services:
    docker:
      memory: 2048


0 votes
Shaun Jones
Community Champion
April 14, 2025

I've seen these memory issues with the Bitbucket dependency scanner. Sometimes switching to 4x will work, but I've seen plenty of times where we had to bump it to 8x.

One trick to try is disabling UPDATE_NVD_CACHE if you don’t need it every run, to save some memory.
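
Roughly speaking, that just means flipping the pipe variable off for the runs where you don't need a fresh NVD cache, something like:

- pipe: atlassian/bitbucket-dependency-scanner:0.7.0
  variables:
    NVD_API_KEY: "$NVD_API_KEY"
    UPDATE_NVD_CACHE: "false"   # skip refreshing the NVD cache on this run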

And it makes sense that the pipe works sometimes and not others, because the memory load fluctuates. Hope that helps... let me know if you need more details. It's been a little while since I last fixed this, but I can walk you through the steps I remember.

Zahidur Rahman
I'm New Here
April 15, 2025

I also tried with 8x and disabled UPDATE_NVD_CACHE, but got the same error.

Shaun Jones
Community Champion
April 15, 2025

OK, so it's still throwing an unexpected EOF... that makes me think it's Java memory, not Docker memory. Make sure you're on the latest pipe version; I think they had notes in a recent release about fixing crash issues under load. If that's not it, here's one more thing to try -- manually set JAVA_OPTS in the script to control the Java heap (giving it a fixed range so it doesn't spike and crash).

JAVA_OPTS='-Xms2g -Xmx8g'
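
To be clear, that's an experiment rather than a guaranteed fix: it assumes the pipe passes its variables through as environment variables and that the underlying Dependency-Check launcher respects JAVA_OPTS. A sketch of where it would go:

- pipe: atlassian/bitbucket-dependency-scanner:0.7.0
  variables:
    NVD_API_KEY: "$NVD_API_KEY"
    JAVA_OPTS: "-Xms2g -Xmx8g"   # assumption: forwarded to the Dependency-Check JVM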

 
