How can I deploy to a specific directory in my S3 bucket

3baileys
November 6, 2019

I'm using the aws-s3-deploy pipe to deploy artifacts from my pipeline build to my S3 bucket, and it works great.

(https://bitbucket.org/atlassian/aws-s3-deploy/src/master/)

I would like to be able to deploy to a sub-folder in the S3 bucket rather than the root directory. I tried adding a subdirectory specification to the S3_BUCKET variable definition, but that did not work.

 

1 answer

0 votes
Oleksandr Kyrdan
Atlassian Team
January 16, 2020

Hi @3baileys ,

Thank you for your feedback!
Can you provide us with your pipeline configuration so we can help you?


Cheers,
Alex

bibrin
January 16, 2020

Here's a minimal configuration that puts a time-stamped example file in the root directory of my S3 bucket (the formatting may be messed up):

image:
  name: bibrin/fruitloops

clone:
  depth: full

pipelines:
  branches:
    test:
      - step:
          script:
            - echo "Pipeline test" > "$(date +"%Y_%m_%d_%I_%M_%p").txt"
            - pipe: atlassian/aws-s3-deploy:0.2.1
              variables:
                AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
                AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
                AWS_DEFAULT_REGION: "us-east-1"
                S3_BUCKET: "s3-bucket-name"
                LOCAL_PATH: "$(pwd)"
          artifacts:
            - 2020*.txt

 

What I'd like to do is put the file into a subdirectory, perhaps by setting a REMOTE_PATH: "subdir" variable?
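
For illustration, this is the kind of thing I'm imagining (REMOTE_PATH is just my guess at a name, not a variable I've seen documented for the pipe):

    - pipe: atlassian/aws-s3-deploy:0.2.1
      variables:
        AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
        AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
        AWS_DEFAULT_REGION: "us-east-1"
        S3_BUCKET: "s3-bucket-name"
        LOCAL_PATH: "$(pwd)"
        REMOTE_PATH: "subdir"  # hypothetical variable, just to show the intent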

Oleksandr Kyrdan
Atlassian Team
January 28, 2020

@3baileys ,

The S3_BUCKET parameter can be set to 's3-bucket-name/logs', and the files will be stored in the "logs" folder of s3-bucket-name:

script:
  - echo "Pipeline test" > "$(date +"%Y_%m_%d_%I_%M_%p").txt"
  - pipe: atlassian/aws-s3-deploy:0.3.5
    variables:
      AWS_ACCESS_KEY_ID: $AWS_ACCESS_KEY_ID
      AWS_SECRET_ACCESS_KEY: $AWS_SECRET_ACCESS_KEY
      AWS_DEFAULT_REGION: 'us-east-1'
      S3_BUCKET: 's3-bucket-name/logs'
      LOCAL_PATH: $(pwd)
      EXTRA_ARGS: "--exclude=* --include=2020*.txt"
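
As far as I know, the pipe is a wrapper around aws s3 sync, so the step above should behave roughly like this CLI call (a sketch using the same placeholder bucket name):

    aws s3 sync "$(pwd)" "s3://s3-bucket-name/logs" \
      --exclude='*' --include='2020*.txt'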

Also, it's better to use the newest version of the pipe.

rickvschalkwijk
March 31, 2020

@Oleksandr Kyrdan is it possible to update the README.md in the pipe's repository?

rickvschalkwijk
March 31, 2020

@Oleksandr Kyrdan I've got a branch ready for a PR, but unfortunately I don't have the rights to push anything.

Oleksandr Kyrdan
Atlassian Team
March 31, 2020

Hi @rickvschalkwijk ,

good suggestion, thank you! We'll update the README for the aws-s3-deploy pipe.
