Hi all, maybe somebody can help me understand either what I am doing wrong or what I am missing here.
I'm trying to configure a pipeline so that whenever a file is merged into my master branch (set as the "development branch" in the repo settings) via a pull request, it is zipped up and uploaded to an AWS S3 bucket. The problem I am running into is that I'm currently using the "atlassian/aws-s3-deploy:0.3.2" pipe with the following configuration:
pipelines:
  default:
    - step:
        script:
          - pipe: atlassian/aws-s3-deploy:0.3.2
            variables:
              AWS_ACCESS_KEY_ID: ${AWS_ACCESS_KEY_ID}
              AWS_SECRET_ACCESS_KEY: ${AWS_SECRET_ACCESS_KEY}
              AWS_DEFAULT_REGION: ${AWS_REGION}
              S3_BUCKET: 'mybucketname'
              LOCAL_PATH: '.'
This just overwrites the entire bucket contents whenever any commit is made to the branch, whether merged via a PR or not. Both of those things are bad.
What would a "bitbucket-pipelines.yml" file look like that only ran when code was merged from an approved PR, and zipped up only the files that were modified/created as part of that PR? My research thus far has led me to a dead end! Any help is appreciated; as you can probably tell, I am new to Bitbucket (and Pipelines!). Thanks in advance.
Also, you can add EXTRA_ARGS: '--exclude=* --include=*.zip' to upload only .zip files to S3.
For more details, see the AWS docs on Use of Exclude and Include Filters.
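For example, a minimal sketch of a step with that filter added (assuming you create the archive earlier in the same script; the name build.zip is just a placeholder):

    - step:
        script:
          # Create the archive to upload; depending on your build image
          # you may need to install zip first (e.g. apt-get install -y zip)
          - zip -r build.zip .
          - pipe: atlassian/aws-s3-deploy:0.3.2
            variables:
              AWS_ACCESS_KEY_ID: ${AWS_ACCESS_KEY_ID}
              AWS_SECRET_ACCESS_KEY: ${AWS_SECRET_ACCESS_KEY}
              AWS_DEFAULT_REGION: ${AWS_REGION}
              S3_BUCKET: 'mybucketname'
              LOCAL_PATH: '.'
              EXTRA_ARGS: '--exclude=* --include=*.zip'

With the exclude/include pair, the sync only touches the .zip archive instead of mirroring your whole working directory into the bucket.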
Cheers,
Alex
Hi @Zachary Wallace ,
you can implement branch workflows in your bitbucket-pipelines.yml so that the deploy step only runs on commits to master, instead of on every branch as the default pipeline does.
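A rough sketch of how that could look, combining the branch trigger with a diff-based zip (untested; the archive name changes.zip and the diff-against-first-parent approach are assumptions based on your description):

    pipelines:
      branches:
        master:
          - step:
              script:
                # List the files changed by the merge commit (diff against
                # its first parent), skipping deleted files (ACMR = added,
                # copied, modified, renamed), and zip only those
                - git diff --diff-filter=ACMR --name-only HEAD^1 HEAD | zip changes.zip -@
                - pipe: atlassian/aws-s3-deploy:0.3.2
                  variables:
                    AWS_ACCESS_KEY_ID: ${AWS_ACCESS_KEY_ID}
                    AWS_SECRET_ACCESS_KEY: ${AWS_SECRET_ACCESS_KEY}
                    AWS_DEFAULT_REGION: ${AWS_REGION}
                    S3_BUCKET: 'mybucketname'
                    LOCAL_PATH: '.'
                    EXTRA_ARGS: '--exclude=* --include=*.zip'

One caveat: a branches: master pipeline runs on every commit that lands on master, not only on PR merges; Pipelines can't natively tell the difference. You can get the behaviour you want by setting branch permissions on master that block direct pushes, so the only commits reaching it are merged pull requests.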
Cheers,
Alex