Connecting a service running in one container with another: the Docker way

Beginning

Hello everyone! I am writing about an interesting thing in Docker that I learned today: connecting to a service running in one Docker container from another. The complete story is that I was experimenting with the Airflow service via Docker on my machine and was trying to run a DAG task that connects to Elasticsearch (a distributed RESTful search engine). To make this work, I ran Elasticsearch via Docker on my machine as well. So now I have two Docker containers up and running on my host machine.

I could connect to the Elasticsearch container from my host via the localhost address 127.0.0.1. I followed the instructions on this page to run a single-node Elasticsearch on my host machine.

sa@Air ~ % curl http://127.0.0.1:9200/_cat/health
1649845689 10:28:09 docker-cluster green 1 1 3 3 0 0 0 0 - 100.0%
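
For reference, the single-node setup from that guide boils down to a single docker run command along the lines of the one below. Treat this as a sketch: the image tag and the exact options are assumptions, so follow whatever the guide you use specifies.

sa@Air ~ % docker run -d --name elasticsearch \
    -p 9200:9200 -p 9300:9300 \
    -e "discovery.type=single-node" \
    docker.elastic.co/elasticsearch/elasticsearch:7.17.0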

Problem arises

As I mentioned earlier, I wanted to connect to the Elasticsearch container from my running Airflow container. But when I tried to execute the above command inside the Airflow container, it threw a connection error:

root@d5f91d9956f9 # curl http://127.0.0.1:9200/_cat/health
curl: (7) Failed to connect to 127.0.0.1 port 9200: Connection refused

This makes sense in hindsight: inside the Airflow container, 127.0.0.1 refers to that container's own loopback interface, not to the host, so nothing is listening there on port 9200. I started reading about how to connect a service running in one Docker container with another, and the answers suggested creating a Docker network and attaching both containers to it. The basic idea is that once both containers are on the same network, they can communicate with each other.
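
In its simplest form, the pattern looks something like this (the network and container names below are placeholders, not the ones from my setup):

sa@Air ~ % docker network create my-shared-net
sa@Air ~ % docker network connect my-shared-net container-a
sa@Air ~ % docker network connect my-shared-net container-b

In my case, though, the Airflow container already had its own network, so I only had to connect the Elasticsearch container to it, as the steps below show.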

I was running my Airflow service using Breeze (the dev tool for Apache Airflow), which under the hood runs Airflow in a Docker container. To confirm I was heading in the right direction, I asked this question in the #development channel of the Airflow Slack and got help from one of the committers (thanks, Kamil Bregula!).

A solution that worked for me

Following are the steps I took to establish communication between the containers.

  1. Find the network name of the container running the Airflow service
sa@Air ~ % docker inspect d5f91d9956f9 -f "{{json .NetworkSettings.Networks }} "
{
  "docker-compose_default": {
    "IPAMConfig": null,
    "Links": null,
    "Aliases": [
      "d5f91d9956f9"
    ],
    "NetworkID": "54789356324f4b6fff64d4c2218d8b6afe15a742cdbe2c7deaa85eb1b4fc945d",
    "EndpointID": "1b85baa6dec349dd5e8032be50dd65ed1b7c62cf0c30353f291784ab4ae973ae",
    "Gateway": "172.28.0.1",
    "IPAddress": "172.28.0.2",
    "IPPrefixLen": 16,
    "IPv6Gateway": "",
    "GlobalIPv6Address": "",
    "GlobalIPv6PrefixLen": 0,
    "MacAddress": "02:42:ac:1c:00:02",
    "DriverOpts": null
  }
}

I ran the docker inspect command to get information about the container. Here d5f91d9956f9 is the Airflow container ID (it can be replaced with the container name too), and I used the format string {{json .NetworkSettings.Networks }} to extract only the network information. The JSON key docker-compose_default is the name of the network the Airflow container is attached to. When Airflow runs via Breeze, its container is attached to this network (it is just a network name; the driver, such as bridge or overlay, is specified when the network is created). To know more details, check this Docker guide.
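
As a quick cross-check, the same network name also shows up in the list of networks known to the Docker engine:

sa@Air ~ % docker network ls --filter name=docker-compose_default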

  2. Find the network name of the Elasticsearch container
sa@Air ~ % docker inspect 836c3e204ae9 -f "{{json .NetworkSettings.Networks }}"
{
  "bridge": {
    "IPAMConfig": null,
    "Links": null,
    "Aliases": null,
    "NetworkID": "218bf98b3e01c98e543843d50a89db2f11f9b5504937cba162a058b936f532ca",
    "EndpointID": "",
    "Gateway": "",
    "IPAddress": "",
    "IPPrefixLen": 0,
    "IPv6Gateway": "",
    "GlobalIPv6Address": "",
    "GlobalIPv6PrefixLen": 0,
    "MacAddress": "",
    "DriverOpts": null
  }
}

Here 836c3e204ae9 is the Elasticsearch container ID. The JSON key is bridge: bridge is the default network driver in the Docker engine, and the Elasticsearch container is running on that default network.

  3. Connect the Elasticsearch container to the Airflow network. By doing this, both containers can communicate over the same network. Following is the command used to connect a container to a network:
sa@Air ~ % docker network connect docker-compose_default elasticsearch

Once this command is run, the Elasticsearch container is connected to the Airflow container's network. To confirm it, I executed the docker inspect command again:

sa@Air ~ % docker inspect 836c3e204ae9 -f "{{json .NetworkSettings.Networks }}"
{
  "bridge": {
    "IPAMConfig": null,
    "Links": null,
    "Aliases": null,
    "NetworkID": "218bf98b3e01c98e543843d50a89db2f11f9b5504937cba162a058b936f532ca",
    "EndpointID": "92b109ac33b6c9d8964ff7583949b5a332338ee8aac9a8d35bdd9bff35cd39cd",
    "Gateway": "172.17.0.1",
    "IPAddress": "172.17.0.3",
    "IPPrefixLen": 16,
    "IPv6Gateway": "",
    "GlobalIPv6Address": "",
    "GlobalIPv6PrefixLen": 0,
    "MacAddress": "02:42:ac:11:00:03",
    "DriverOpts": null
  },
  "docker-compose_default": {
    "IPAMConfig": {},
    "Links": null,
    "Aliases": [
      "836c3e204ae9"
    ],
    "NetworkID": "54789356324f4b6fff64d4c2218d8b6afe15a742cdbe2c7deaa85eb1b4fc945d",
    "EndpointID": "c9ae0a8e5c3a7ecc6e105d6c501dc223aa35ab348ba1a14d9e90fe53f170d363",
    "Gateway": "172.28.0.1",
    "IPAddress": "172.28.0.3",
    "IPPrefixLen": 16,
    "IPv6Gateway": "",
    "GlobalIPv6Address": "",
    "GlobalIPv6PrefixLen": 0,
    "MacAddress": "02:42:ac:1c:00:03",
    "DriverOpts": {}
  }
}

You can see the network details of Elasticsearch now contain two networks: bridge (its original default network) and docker-compose_default (the network we connected to establish communication between the containers). Here the IPAddress on docker-compose_default is 172.28.0.3. We are going to use this IP to connect to the Elasticsearch container from the running Airflow container.

This way you can connect both containers using the network.
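
As a side note, if you run Elasticsearch yourself with Docker Compose, you can declare the shared network up front instead of connecting it by hand afterwards. A minimal sketch, assuming the Airflow network docker-compose_default already exists (the image tag is again an assumption):

services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.0
    environment:
      - discovery.type=single-node
    ports:
      - "9200:9200"
    networks:
      - docker-compose_default

networks:
  docker-compose_default:
    external: true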

Finally, I have to test whether my connection works. For this, I executed the command below in my running Airflow container; it fetches the indices (and their aliases) from my Elasticsearch, and it works like a charm:

root@d5f91d9956f9:/opt/airflow/tests/system/providers/elasticsearch# curl http://172.28.0.3:9200/_aliases

{"my-world":{"aliases":{}},"my-airflow":{"aliases":{}},"flights":{"aliases":{}}

Note that in the above command, I am pointing to Elasticsearch with the IP address 172.28.0.3 rather than 127.0.0.1.
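
A related note: because docker-compose_default is a user-defined network, Docker's embedded DNS also resolves container names on it. So instead of the hard-coded IP, the container name elasticsearch (the name I gave the container) should work as well:

root@d5f91d9956f9:/opt/airflow/tests/system/providers/elasticsearch# curl http://elasticsearch:9200/_cat/health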

You can also confirm that both containers share the network by running the docker command below:

 sa@Air ~ %  docker network inspect docker-compose_default -f "{{json .Containers }}"
{
  "836c3e204ae91e9be671746b1a073f1d523d22a317b4e9949585d0960cd85ee5": {
    "Name": "elasticsearch",
    "EndpointID": "c9ae0a8e5c3a7ecc6e105d6c501dc223aa35ab348ba1a14d9e90fe53f170d363",
    "MacAddress": "02:42:ac:1c:00:03",
    "IPv4Address": "172.28.0.3/16",
    "IPv6Address": ""
  },
  "d5f91d9956f9c5aabb644c84319abe1e33a59e13a756ba5d6547ef15372d073d": {
    "Name": "docker-compose_airflow_run_e9e73f48a9e2",
    "EndpointID": "1b85baa6dec349dd5e8032be50dd65ed1b7c62cf0c30353f291784ab4ae973ae",
    "MacAddress": "02:42:ac:1c:00:02",
    "IPv4Address": "172.28.0.2/16",
    "IPv6Address": ""
  }
}

This command lists the containers connected to the network docker-compose_default.
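
And if you ever want to undo the manual step, the container can be detached from the network again:

sa@Air ~ % docker network disconnect docker-compose_default elasticsearch

Keep in mind that a docker network connect done by hand does not survive the container being removed and recreated, so you may need to repeat it (or bake the network into your compose file as sketched above).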

I hope you found this information useful. I will share more things I learn in upcoming blogs. Have a great day! :)

**PC: Photo by Pawel Czerwinski on Unsplash**