Hello,
We are moving our builds from on-premises Bamboo to Atlassian Cloud (Bitbucket Pipelines).
Our build involves linking third-party and locally built libraries.
For on-premises builds (using Bamboo), we have been hosting these libraries in a local repository; the Bamboo build downloads the libraries for the specific software product and version being built from that repository.
Since we're moving to Atlassian Cloud (Bitbucket Pipelines), these libraries need to be available for download during builds running in Bitbucket Pipelines.
Our IT team does not want to open up our internal repository to the internet for security reasons. They suggest that we take the needed libraries and host them elsewhere.
So, what is a suitable option for hosting the needed libraries so they are available to builds running in Bitbucket Pipelines on Atlassian Cloud?
Some options I have considered are listed below, but each comes with cons.
Option 1: Use a virtual machine in a public cloud to host the libraries. Con: in addition to the cost of running the VM, we will also incur cloud data egress charges when pipeline builds download the libraries.
Option 2: Create version-specific Docker images with the libraries pre-loaded, so a build running in a container from such an image already has them available (rough sketch below). However, the libraries total a few gigabytes (up to 3 GB); is it a good idea to have a 3 GB build image? Con: we will have to keep rebuilding the images as we refresh libraries or make new releases.
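To make Option 2 concrete, this is roughly the kind of image I have in mind (the base image, paths and version numbers below are placeholders, not our real ones):

# Dockerfile sketch: version-specific build image with libraries baked in
FROM ubuntu:22.04
RUN apt-get update && apt-get install -y build-essential && rm -rf /var/lib/apt/lists/*
# Libraries for this specific product/version are copied in when the image is built
COPY libs/ /opt/build-libs/1.2.0/
# Make the linker find them during the build
ENV LIBRARY_PATH=/opt/build-libs/1.2.0

The pipeline would then point its top-level image: key at that image (e.g. image: mycompany/build-image:1.2.0) pulled from a registry we control.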
Option 3: Store the libraries in a cloud storage bucket (like a GCS bucket on Google Cloud) and pull them from there. The problem is that I haven't found a suitable pipe to copy files from a GCS bucket into the build environment (more on this further down).
Option 4: Store the libraries in Bitbucket itself. Con: it isn't good practice to store large binaries in Git version control (unless perhaps Git LFS helps, as in the sketch below?), but your opinion is welcome.
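If Option 4 were workable at all, I assume it would have to be via Git LFS rather than plain Git, something like the following (the file patterns are just examples):

# .gitattributes - track large binary libraries with Git LFS instead of plain Git
*.so filter=lfs diff=lfs merge=lfs -text
*.a filter=lfs diff=lfs merge=lfs -text

and, as far as I understand, the pipeline clone would then need LFS enabled in bitbucket-pipelines.yml:

clone:
  lfs: true

I'm not sure how LFS storage limits and pipeline clone times would hold up with a few gigabytes of libraries, though.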
Option 5: Use self-hosted runners on premises. I wonder whether this will again require the Linux runner host to have an internet-facing IP, and moreover, will a runner have access to my local/internal hosts? (A sketch of a step pinned to a runner is below.)
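For Option 5, my understanding is that a step is pinned to a self-hosted runner with runs-on labels, roughly like this (the labels and the fetch script are placeholders):

pipelines:
  default:
    - step:
        name: Build on our own runner
        runs-on:
          - self.hosted
          - linux
        script:
          # hypothetical script; if the runner sits on our network, it should be able
          # to reach the internal library repository directly
          - ./fetch-libs-from-internal-repo.sh
          - make all

But I'd like confirmation on the networking requirements (inbound vs. outbound connectivity) before going down that path.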
Does Atlassian provide a place to host artifacts needed for builds that the pipeline can download from?
While I considered placing the libraries in a GCS bucket to be a suitable and easy option, I looked at the google-cloud-storage-deploy pipe. That pipe seems to be able to "deploy files and directories to Google Cloud Storage", but I need a mechanism to copy files FROM a GCS bucket TO the build pipeline, so this isn't helping and I am still in limbo.
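The only workaround I can think of is to skip the pipe entirely and call gsutil from a google/cloud-sdk based image, roughly like this (the bucket name and paths are placeholders, and GCP_KEY_FILE would be a secured repository variable holding a base64-encoded service-account key with read access to the bucket):

image: google/cloud-sdk:latest

pipelines:
  default:
    - step:
        name: Fetch libraries from GCS and build
        script:
          # decode the service-account key stored as a secured pipeline variable
          - echo $GCP_KEY_FILE | base64 -d > /tmp/gcp-key.json
          - gcloud auth activate-service-account --key-file /tmp/gcp-key.json
          # copy the version-specific libraries from the bucket into the build workspace
          - gsutil -m cp -r gs://my-build-libs/product-x/1.2.0/ ./libs/
          - make all

Is that a reasonable approach, or is there a better-supported way?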