Hi,
I have a pipeline with two steps that use pipes, and one of them needs a lot of memory.
This is a simplified version of my pipeline:
definitions:
  services:
    docker:
      memory: 7128
  steps:
    - step: &deploy
        name: Deploy Heroku
        script:
          - pipe: atlassian/heroku-deploy:1.1.4
    - step: &whitesource
        name: WhiteSource Scan
        size: 2x
        services:
          - docker
        script:
          - pipe: WhitesourceSoftware/whitesource-scan:1.3.0

pipelines:
  default:
    - step: *deploy
  custom:
    whitesource:
      - step: *whitesource
As far as I know, pipes are executed as Docker-in-Docker, so I just set the maximum memory for the docker service and used a 2x step size.
The WhiteSource pipe works fine with a 2x container and 7128 MB allocated for the pipe, but the deploy step cannot run: it is a 1x container, and when the system tries to allocate the memory for the pipe (the inner docker service) there is not enough left (a 1x step has 4096 MB in total, so a 7128 MB service plus the 1024 MB minimum for the build container doesn't fit), and I get this error:
A step does not have the minimum resources needed to run (1024 MB). Services on the current step are consuming 7128 MB
I tried to set the memory for the service inside the step, but it didn't work. I'm not sure whether that is simply not possible or I'm using the wrong syntax.
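For reference, this is roughly what I tried; the nested memory key under the step's services is just my own guess at a per-step override, so the syntax itself may be the problem:

    - step: &whitesource
        name: WhiteSource Scan
        size: 2x
        services:
          - docker:
              memory: 7128   # guessed per-step override; this is the part that didn't work
        script:
          - pipe: WhitesourceSoftware/whitesource-scan:1.3.0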
As a workaround I can use size: 2x in the deploy step as well (sketched below), but that step doesn't need it, and I'd like to be able to set the extra memory just for the WhiteSource step.
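In other words, something like this for the deploy step:

    - step: &deploy
        name: Deploy Heroku
        size: 2x   # only here so the 7128 MB docker service fits; the step itself doesn't need 2x
        script:
          - pipe: atlassian/heroku-deploy:1.1.4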
Thanks.