So I am writing my first pipe. It will generate a single file as output. Given that the pipe runs in a container isolated from any other pipes/steps, how would I have another pipe pick up the file that was generated?
e.g.
1. create-file-pipe = results in pipe-file.zip
2. push-file-pipe = takes pipe-file.zip and does something with it
In addition, I'll know the filename (as it's made up of some parameters passed in), but it would be helpful if I could set something like an output variable.
EDIT: Is it as simple as using a variable like OUTPUT_PATH, which is then available between steps/pipes, assuming the consumer passes it using an artifacts directive?
Thanks
Just want to share the way I resolved this in the end.
The user passes in a variable called output_folder.
My pipe writes the file to that folder, and you can then use artifacts with a globbing pattern to share it between steps.
Simples
Hi @Mark Harrison, will you share a code snippet from the pipe and the pipeline for how you did this?
I have been trying to do the same thing and have had no luck getting it to work.
Hi @Rob Jahn
Yes of course!
The pipe I created makes use of one of our existing containers, which outputs a file inside the pipe's container (you can specify an optional output folder).
The line in the pipe which does this looks like this:
run octo pack --id "$ID" --version "$VERSION" --format "$FORMAT" --basePath "$SOURCE_PATH" --outFolder "$OUTPUT_PATH" "${EXTRA_ARGS[@]}"
You can see the entire pipe here: https://bitbucket.org/octopusdeploy/pack/src/eb892075e4c204e2d4e13eb41da06efe998fd4f7/pipe/pipe.sh#lines-50
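For context, the variable handling inside the pipe boils down to roughly the following. This is a simplified sketch rather than a copy of the real pipe.sh (see the link above for the actual script), and the defaults shown are illustrative:

# simplified sketch of the variable handling (not the real pipe.sh)
SOURCE_PATH=${SOURCE_PATH:="."}    # folder to pack, relative to the clone directory
OUTPUT_PATH=${OUTPUT_PATH:="."}    # folder the package is written to, relative to the clone directory

# the package lands under $OUTPUT_PATH inside the build directory,
# which is what lets a later artifacts glob pick it up
octo pack --id "$ID" --version "$VERSION" --format "$FORMAT" \
  --basePath "$SOURCE_PATH" --outFolder "$OUTPUT_PATH"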
The relevant snippet from the resulting pipeline file looks like this:
- step:
    name: Pack for Octopus
    script:
      - export VERSION=1.0.0.$BITBUCKET_BUILD_NUMBER
      - pipe: octopusdeploy/pack:0.4.0
        variables:
          ID: ${BITBUCKET_REPO_SLUG}
          FORMAT: 'Zip'
          VERSION: ${VERSION}
          SOURCE_PATH: '.'
          OUTPUT_PATH: './out'
          DEBUG: 'false'
    artifacts:
      - out/*.zip
You can see the whole file in our OctopusSamples repo: https://bitbucket.org/octopussamples/randomquotes-js/src/ba8eee56da6d53f03f6e0b40f7bdfd0c284ce079/bitbucket-pipelines.yml#lines-30
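To consume the package in a later step, the shared artifact is restored into the clone directory automatically. A consuming step could look roughly like the sketch below (this isn't from the sample repo; the commands are placeholders):

- step:
    name: Use the package
    script:
      # out/*.zip from the previous step is restored into the clone directory here
      - ls out/*.zip
      # do whatever the consumer needs with the package (placeholder command)
      - echo "Found package $(ls out/*.zip)"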
So, in the end, I didn't make use of the shared pipe directory (BITBUCKET_PIPE_SHARED_STORAGE_DIR).
The main reason for this is that the resulting file will be either a .zip or a .nupkg, and I didn't want the consumer to need to know the name of the file, as is the case in the "Deploy an AWS Lambda function" example referenced here: https://confluence.atlassian.com/bitbucket/deploying-a-lambda-function-update-to-aws-967319469.html
That being said, there are some drawbacks to my approach:
1. Once the output folder is shared as an artifact from the initial step, it can't be written to again: it appears read-only in any subsequent steps. So creating an output folder is not ideal in cases where you want to keep adding files to that folder across steps. I did a little research and originally tried to modify the folder permissions in subsequent steps, but changing the owner/permissions appears to be restricted - something to do with Docker user namespaces, from what I could tell. (See the sketch after this list for one possible workaround.)
2. As a result of 1), the use-case of my pipe is limited to sharing files between steps via unique output folders (although you might be able to get this to work using the root directory).
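One possible workaround for 1), which I haven't verified inside the pipe itself but which follows from the read-only behaviour described above, is to copy the restored artifact into a fresh folder before modifying anything:

- step:
    name: Repack in a later step
    script:
      # the restored out/ folder may be read-only, so copy its contents somewhere writable first
      - mkdir -p out-rw
      - cp out/*.zip out-rw/
      - ls out-rw/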
Hope that helps, but I'd be happy to answer any other questions you may have :)
Thanks, Mark
Yes, this helps. There are some other community question threads showing the use of BITBUCKET_PIPE_SHARED_STORAGE_DIR, which I found did not work outside the pipe. Like you, I was able to just create a file in the pipe and use that file name as the artifact, then read that artifact file in a subsequent step.
Hi @Rob Jahn
I'm glad it helped. If I can help further feel free to drop me a message here :)
Thanks!
Hi @Mark Harrison, we do something similar in the example for deploying AWS Lambda functions: https://confluence.atlassian.com/bitbucket/deploying-a-lambda-function-update-to-aws-967319469.html. Remember that artifact paths can't be dynamic, so you are not going to be able to use environment variables when defining an artifact.
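For example (snippet added purely for illustration):

artifacts:
  - out/*.zip             # OK: a literal glob pattern
  # - $OUTPUT_PATH/*.zip  # won't work: variables are not expanded in artifact paths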
So if I understand correctly, I would need to specify a fixed filename, something like this:
# The pipe exports the newly published
# Lambda version to a file.
artifacts:
  - pipe.meta.env
where pipe.meta.env could be any filename I choose as the pipe creator. Could this file be used to store a dynamic filename within its contents?
So if I had a pipe that took multiple variables which construct the filename, could I then use source to read in the filename I want to use in a subsequent step/pipe?
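As a sketch of what I have in mind (the filename and variable names below are just placeholders, not something from the Lambda example): the pipe writes the dynamically built filename into a file with a fixed name, that fixed name is declared as the artifact, and a later step sources it.

# inside the pipe (sketch): record the dynamic filename under a fixed name
echo "PACKAGE_FILE=${ID}.${VERSION}.zip" > pipe.meta.env

# bitbucket-pipelines.yml (sketch)
artifacts:
  - pipe.meta.env

# a later step reads it back
- source pipe.meta.env
- echo "Package is $PACKAGE_FILE"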
Is the name pipe.meta.env a special name? I can't find that filename specified anywhere in the AWS Lambda deploy code.
I also found it unclear where this pipe.meta.env file is being created.
In the Lambda repo you reference, https://bitbucket.org/atlassian/aws-lambda-deploy/src/master/ , the readme also shows this example of using the built-in variable $BITBUCKET_PIPE_SHARED_STORAGE_DIR in the pipe,
then reading it back in a later script step:
- VERSION=$(jq --raw-output '.Version' $BITBUCKET_PIPE_SHARED_STORAGE_DIR/aws-lambda-deploy-env)
which, according to this page, is not usable outside the pipe:
https://confluence.atlassian.com/bitbucket/advanced-techniques-for-writing-pipes-969511009.html
In my testing, I found that BITBUCKET_PIPE_SHARED_STORAGE_DIR did not have a value outside the pipe. I tested using: echo "BITBUCKET_PIPE_SHARED_STORAGE_DIR = $BITBUCKET_PIPE_SHARED_STORAGE_DIR"
At the moment, I am all set. I am using the solution I mention above: create a file in the pipe and use that file name as the artifact, then in a subsequent step read in that artifact file. I don't add a folder path; the file just lives in the root folder with the checked-out code.
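In pipeline terms, the pattern I ended up with looks roughly like this (the pipe name and filename below are placeholders, not my real ones):

- step:
    name: Run my pipe
    script:
      - pipe: some-team/my-pipe:1.0.0   # placeholder pipe reference
    artifacts:
      - pipe-output.txt                 # fixed filename the pipe writes into the clone root
- step:
    name: Read the output
    script:
      - cat pipe-output.txt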