I've tried to replicate the Bitbucket Pipelines build locally, and I can get it working so that my integration tests run and pass with Docker. However, once I push to Bitbucket those same tests fail.
I think the answer is something simple - I just can't figure it out.
It's a Django project/app, and I use Selenium to launch Firefox and Chrome browsers to test against.
I have the following `docker-compose.yml` and this works locally:
```yaml
version: "3.3"
services:
  python:
    image: "my_user/python:3.8"
    build:
      context: ./docker
      dockerfile: python.dock
    volumes:
      - ".:/my_project"
    environment:
      - DJANGO_DEBUG=True
      - my_project_SECRET_KEY=1234_secret_key_4321
      - BITBUCKET_CI=True
    ports:
      - "8000:8000"
    links:
      - hub
  firefox:
    image: selenium/node-firefox
    volumes:
      - /dev/shm:/dev/shm
    depends_on:
      - hub
    environment:
      HUB_HOST: hub
  chrome:
    image: selenium/node-chrome
    volumes:
      - /dev/shm:/dev/shm
    depends_on:
      - hub
    environment:
      HUB_HOST: hub
  hub:
    image: selenium/hub
    ports:
      - "4444:4444"
```
The `python.dock` file is very simple:
```dockerfile
FROM python:3.8
SHELL ["/bin/bash", "-c"]
RUN pip install virtualenv
CMD ["tail", "-F", "-n0", "/etc/hosts"]
```
Then I launch an interactive bash shell in that container, install the requirements, and run pytest.
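For reference, the local workflow is roughly the following (service name and project path come from the compose file above; the exact commands I run may differ slightly):

```shell
# Start the hub, browser nodes, and python container in the background
docker-compose up -d

# Open an interactive shell in the python service
docker-compose exec python bash

# Inside the container: set up a virtualenv in the mounted project and run the tests
cd /my_project
pip install virtualenv
virtualenv venv && source venv/bin/activate
pip install -r requirements-dev.txt
pytest
```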
My `bitbucket-pipelines.yml` file is also pretty simple/straightforward:
```yaml
image: python:3.8

pipelines:
  branches:
    dev:
      - step:
          name: Build project & run pytest
          caches:
            - pip
          services:
            - hub
          script: # Modify the commands below to build your repository.
            - pip install virtualenv
            - virtualenv venv
            - source venv/bin/activate
            - apt-get install gcc libc-dev g++ libffi-dev libxml2 unixodbc-dev gnupg2 -y
            - pip install -r requirements-dev.txt
            - pytest

definitions:
  services:
    firefox:
      image: selenium/node-firefox
      volumes:
        - /dev/shm:/dev/shm
      depends_on:
        - hub
      environment:
        HUB_HOST: hub
    chrome:
      image: selenium/node-chrome
      volumes:
        - /dev/shm:/dev/shm
      depends_on:
        - hub
      environment:
        HUB_HOST: hub
    hub:
      image: selenium/hub
      ports:
        - "4444:4444"
```
I use `pytest` for my testing, and the browser setup for my tests looks like this:
```python
from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities

HUB_URL = "http://hub:4444/wd/hub"

chrome_capabilities = DesiredCapabilities.CHROME.copy()
browser = webdriver.Remote(
    command_executor=HUB_URL, desired_capabilities=chrome_capabilities
)
```
This launches the appropriate browser, and all my tests pass without a problem when using Docker locally. I believe that's because I use `links` to attach the two containers to each other, so I can use the service name in the URL and it gets routed appropriately.
When I run my CI on Bitbucket the tests fail.
This is the failure message:
```
        try:
>           conn = connection.create_connection(
                (self._dns_host, self.port), self.timeout, **extra_kw
            )

venv/lib/python3.8/site-packages/urllib3/connection.py:159:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('hub', 4444), timeout = <object object at 0x7f8bd0757630>
source_address = None, socket_options = [(6, 1, 1)]

    def create_connection(
        address,
        timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
        source_address=None,
        socket_options=None,
    ):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        An host of '' or port 0 tells the OS to use the default.
        """

        host, port = address
        if host.startswith("["):
            host = host.strip("[]")
        err = None

        # Using the value from allowed_gai_family() in the context of getaddrinfo lets
        # us select whether to work with IPv4 DNS records, IPv6 records, or both.
```
I believe the reason for this has to do with using `http://hub:4444/wd/hub` as the URL.
In pipelines what should I be using for that URL? I've tried `localhost` and `127.0.0.1` with no luck.
I am using Docker to replicate the Pipelines environment locally so I don't burn through build minutes unnecessarily. But now that it works locally, I'm at a loss on how to get this going in Bitbucket Pipelines.
@cgoodwin Hope you're doing alright. I know it's been a long time since this post. Were you able to figure it out?
On Bitbucket Pipelines, all services run on the same host, `127.0.0.1`.
Try replacing `hub` with `127.0.0.1`:
```yaml
image: python:3.8

pipelines:
  branches:
    dev:
      - step:
          name: Build project & run pytest
          caches:
            - pip
          services:
            - hub
          script: # Modify the commands below to build your repository.
            - pip install virtualenv
            - virtualenv venv
            - source venv/bin/activate
            - apt-get install gcc libc-dev g++ libffi-dev libxml2 unixodbc-dev gnupg2 -y
            - pip install -r requirements-dev.txt
            - pytest

definitions:
  services:
    firefox:
      image: selenium/node-firefox
      volumes:
        - /dev/shm:/dev/shm
      depends_on:
        - hub
      environment:
        HUB_HOST: 127.0.0.1
    chrome:
      image: selenium/node-chrome
      volumes:
        - /dev/shm:/dev/shm
      depends_on:
        - hub
      environment:
        HUB_HOST: 127.0.0.1
    hub:
      image: selenium/hub
      ports:
        - "4444:4444"
```
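To avoid hard-coding the hub host in the test code, one option is to read it from an environment variable. This is just a sketch; `SELENIUM_HUB_HOST` is a made-up variable you would set yourself (e.g. in the compose file's `environment` section locally, and in the Pipelines step), not something Pipelines provides:

```python
import os


def hub_url(default_host: str = "hub") -> str:
    """Build the Selenium hub URL, overriding the host per environment."""
    # SELENIUM_HUB_HOST is a hypothetical variable you set yourself:
    # leave it unset under docker-compose (the service name resolves via
    # links/networking), and set it to 127.0.0.1 in bitbucket-pipelines.yml.
    host = os.environ.get("SELENIUM_HUB_HOST", default_host)
    return f"http://{host}:4444/wd/hub"
```

That way the same `webdriver.Remote(command_executor=hub_url(), ...)` call works in both environments without touching the test code.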
@cgoodwin I know this is almost a year later, but did you ever get an answer to this? I'm suffering essentially the exact same issue
Hey Simon, I wish I had ever gotten an answer - but I did not.
I ended up rolling my own Docker image with Chromedriver & Geckodriver installed, and I use that image in the pipeline to run the Selenium tests with pytest.
It works, it does the job.
However, we've got some projects that require testing multiple apps. I wanted to use the parallel-steps feature of Bitbucket Pipelines, but there's no way to cache the Docker image I'm using... so every step re-downloads that image, which takes time.
So, while creating my own image to test with works - it has drawbacks.
If there was a way to have the pipeline download the image once - cache it and reuse it in all the steps it would be a great solution.
Sorry I'm late to reply - I wish I had a better answer.
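For anyone curious, the image was roughly along these lines. This is a sketch from memory, not the exact Dockerfile; package names and the Geckodriver version are illustrative and depend on your base image's Debian release:

```dockerfile
FROM python:3.8

# Browsers + ChromeDriver from the Debian repos (names vary by release)
RUN apt-get update \
    && apt-get install -y --no-install-recommends firefox-esr chromium chromium-driver \
    && rm -rf /var/lib/apt/lists/*

# Geckodriver for Firefox (pin whatever version matches your Firefox build)
RUN wget -q https://github.com/mozilla/geckodriver/releases/download/v0.30.0/geckodriver-v0.30.0-linux64.tar.gz \
    && tar -xzf geckodriver-v0.30.0-linux64.tar.gz -C /usr/local/bin \
    && rm geckodriver-v0.30.0-linux64.tar.gz

RUN pip install virtualenv
```

With the drivers baked in, the tests use local `webdriver.Chrome()` / `webdriver.Firefox()` (headless) instead of `webdriver.Remote` against a grid.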
@cgoodwin Can you please share the Dockerfile you used to install ChromeDriver and run the Selenium tests from Pipelines?