Postgres docker in Pipelines: no server

sergio_pelin
Contributor
July 7, 2018

I'm trying to work with a Postgres container in Pipelines, adding it as a service and using `psql`, but it fails for various reasons. I therefore reduced my `bitbucket-pipelines.yml` to the simplest version, but I still cannot make it work:

```yaml
pipelines:
  default:
    - step:
        image: postgres:latest
        script:
          - psql
```

This returns the following:

```
+ psql
psql: could not connect to server: No such file or directory
        Is the server running locally and accepting
        connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
```

I tried to find a hint among answered questions, both here and on Stack Overflow, but nothing seems to work.

I would greatly appreciate it if someone could help me understand what I am doing wrong.

UPDATE: Looking at the log, when starting postgres as a service, I noticed this:

```
PostgreSQL init process complete; ready for start up.
2018-07-09 22:45:16.472 UTC [1] LOG:  listening on IPv4 address "0.0.0.0", port 5432
2018-07-09 22:45:16.472 UTC [1] LOG:  listening on IPv6 address "::", port 5432
2018-07-09 22:45:16.476 UTC [1] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
```

Is it correct that it's listening on the address "0.0.0.0"? I would expect it to be "127.0.0.1", according to *Use services and databases in Bitbucket Pipelines*:

> These services share a network adapter with your build container and all open their ports on localhost. No port mapping or hostnames are required. For example, if you were using Postgres, your tests just connect to port 5432 on localhost.

On the other hand, the log says the server is listening on exactly the Unix socket that the error message mentions.

Again, I would really appreciate it if someone could help with this.

1 answer

1 accepted

3 votes
Answer accepted
sergio_pelin
Contributor
July 10, 2018

I kind of found the "solution". To be honest, though, I would have preferred to find the answer in the Pipelines documentation and save myself the time.

I don't know the internals of Pipelines, so any comment giving more details and explaining the issue better is totally welcome.

It seems that Postgres in Pipelines is not accessible if defined as a normal image (not as a service). If you define your `bitbucket-pipelines.yml` like this:

```yaml
pipelines:
  default:
    - step:
        image: postgres:latest
        script:
          - psql -U postgres -c "\du"
```

the Postgres server doesn't seem to start (judging by the log), and obviously there's no access to the database, because it isn't there. It's a bit strange: I would expect the image I define to be launched and to have access to whatever is installed in the container.
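You can confirm this from inside the step itself. A minimal diagnostic sketch, assuming the official `postgres` image (which ships `pg_isready` as part of its client utilities):

```sh
# pg_isready reports whether a server is accepting connections on the
# default socket/port. With postgres used as the build image, it fails,
# because the image's entrypoint never ran.
pg_isready || echo "no server is accepting connections"

# The socket directory exists in the image but stays empty:
# no server ever started, so no socket was created.
ls -A /var/run/postgresql/
```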

Conclusion: the Postgres Docker image works in Pipelines only when defined as a service.

When started as a service, it is accessible to `psql` commands, but you have to use the remote-access form even though the host is 127.0.0.1 (probably due to container linking). And since `psql` runs in the build container (as opposed to connecting to Postgres through a driver from an application or test), you need to make sure the build image has the client installed:

```yaml
pipelines:
  default:
    - step:
        script:
          - apt-get update && apt-get install -y postgresql-client
          - psql -h 127.0.0.1 -p 5432 -U postgres -c "\du"
        services:
          - postgres

definitions:
  services:
    postgres:
      image: postgres
```
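Once the client is installed, a quick way to sanity-check that the service is reachable before running any queries (a sketch; `pg_isready` comes with the `postgresql-client` package installed above):

```sh
# pg_isready checks connectivity without authenticating;
# exit code 0 means the server is accepting connections.
pg_isready -h 127.0.0.1 -p 5432 -U postgres
```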

The mapping to localhost is probably done by Pipelines. The standard Postgres container starts the database listening on `0.0.0.0:5432`, hence replacing `127.0.0.1` with `0.0.0.0` works too.
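Separately, if your tests need a specific database, user, or password, the service definition also accepts environment variables, which the official `postgres` image picks up on first start. A sketch; the names below are placeholders, not from the setup above:

```yaml
definitions:
  services:
    postgres:
      image: postgres
      variables:
        POSTGRES_DB: pipelines          # database created on first start
        POSTGRES_USER: test_user        # superuser created instead of "postgres"
        POSTGRES_PASSWORD: test_password
```

With that in place, the script would connect with `psql -h 127.0.0.1 -p 5432 -U test_user pipelines`.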

matt_baker February 16, 2019

Can we use the code below to populate the Postgres database for test runs?

```sh
docker run -v /tmp/my_scripts:/docker-entrypoint-initdb.d postgres
```

or

```dockerfile
FROM postgres:9.4
RUN mkdir -p /tmp/psql_data/
COPY db/structure.sql /tmp/psql_data/
COPY scripts/init_docker_postgres.sh /docker-entrypoint-initdb.d/
```
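Pipelines doesn't let you mount host volumes into service containers, so the `docker run -v` variant won't carry over. The Dockerfile approach can work, though: bake the init scripts into a custom image, push it to a registry, and reference it in the service definition. A sketch, with a placeholder image name:

```yaml
definitions:
  services:
    postgres:
      # Hypothetical custom image built from the Dockerfile above and
      # pushed to a registry; the scripts in /docker-entrypoint-initdb.d
      # run once when the service container initializes its data directory.
      image: myaccount/postgres-seeded:9.4
```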
