I have two Docker containers running.
The output of docker ps:
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
0bfd25abbfc6 f_service:latest "/usr/local/start-fl 13 seconds ago Up 2 seconds 0.0.0.0:8081->8081/tcp flume
6a1d974f4e3e h_service:latest "/usr/local/start-al 2 minutes ago Up About a minute 0.0.0.0:8080->8080/tcp hadoop
Hadoop services (namenode, datanode, jobtracker, tasktracker, secondarynamenode) are running in the hadoop container.
Flume services (a flume-agent) are running in the flume container.
I want to run Hadoop commands (e.g. hadoop fs -ls /) from inside the flume container. How can I do that? Any ideas?
I tried linking the containers, but failed to get it working.
Run commands for the containers:
docker run -it --name hadoop -p 8080:8080 h_service
jps in the hadoop container shows all Hadoop services.
docker run -it -p 8081:8081 --name flume --link hadoop:hadoop f_service
jps in the flume container shows only Jps and Application (which I guess is Flume).
If I execute any Hadoop command inside the flume container, I get the following error:
mkdir: Call From 282fc55ec08d/172.17.5.236 to localhost:8020 failed on connection exception:
java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org
/hadoop/ConnectionRefused
telnet localhost 8020
is unable to connect to the remote host; the same goes for port 8080.
netstat -na inside the flume container:
Active Internet connections (servers and established)
Proto Recv-Q Send-Q Local Address Foreign Address State
Active UNIX domain sockets (servers and established)
Proto RefCnt Flags Type State I-Node Path
netstat on the hadoop container shows:
Active Internet connections (w/o servers)
Proto Recv-Q Send-Q Local Address Foreign Address State
tcp 0 0 localhost:49096 localhost:8020 TIME_WAIT
tcp 0 0 localhost:49079 localhost:8020 ESTABLISHED
tcp 0 0 localhost:8020 localhost:49079 ESTABLISHED
tcp 0 0 c0c82bab5efd:54003 likho.canonical.com:80 TIME_WAIT
tcp6 0 0 localhost:8021 localhost:40735 ESTABLISHED
tcp6 0 0 localhost:40735 localhost:8021 ESTABLISHED
Active UNIX domain sockets (w/o servers)
Proto RefCnt Flags Type State I-Node Path
unix 2 [ ] STREAM CONNECTED 9223040
unix 2 [ ] STREAM CONNECTED 9223013
unix 2 [ ] STREAM CONNECTED 9222782
unix 2 [ ] STREAM CONNECTED 9222116
unix 2 [ ] STREAM CONNECTED 9221761
unix 2 [ ] STREAM CONNECTED 9221758
unix 2 [ ] STREAM CONNECTED 9221302
unix 2 [ ] STREAM CONNECTED 9221284
unix 2 [ ] STREAM CONNECTED 9220884
unix 2 [ ] STREAM CONNECTED 9220877
As for localhost:8020, I guess port 8020 comes from the configuration in core-site.xml.
Best Answer
This one has a simple solution. First, if you want to connect to port 8020 of your hadoop container, you need to make sure that port is exposed as well. Second, each of these containers has its own loopback interface (localhost) and its own IP address; they are connected through the docker0 bridge network to the host's eth0 interface. So you need to use the address that Docker injects into the flume container, not localhost.
So these will start the containers properly:
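Something along these lines, reusing the images and names from the question (the extra --expose 8020 and the had00p alias are the two changes; treat this as a sketch rather than the exact original commands):

```shell
# expose the namenode port 8020 in addition to publishing 8080
docker run -it --name hadoop -p 8080:8080 --expose 8020 h_service

# link the flume container to hadoop under the alias "had00p"
docker run -it --name flume -p 8081:8081 --link hadoop:had00p f_service
```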
But you will need to tell Flume to connect to Hadoop at "had00p" instead of "localhost". I used had00p here just to distinguish the alias visible inside the flume container from the name you gave the container running Hadoop.
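Concretely, the Hadoop client configuration inside the flume container would point at the alias rather than localhost; a sketch of core-site.xml, assuming Hadoop 1.x property names:

```xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://had00p:8020</value>
  </property>
</configuration>
```

Note that the namenode in the hadoop container also has to listen on a non-loopback address; if it binds only to localhost, the linked container still cannot reach it.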
Here is a plain example:
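A minimal sketch using stock images (busybox stands in for both services; the server/srv names are just for illustration):

```shell
# start a named container that listens on a port
docker run -d --name server busybox nc -l -p 1234

# link a client to it under the alias "srv"; the alias resolves
# to the server container's IP address
docker run --rm --link server:srv busybox ping -c 2 srv
```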
When Docker creates application links it injects a number of environment variables and adds a hostname to the /etc/hosts file of the linked container. It will also add firewall rules to allow communication between the two containers if you have inter-container communication disabled.
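You can inspect both mechanisms from a linked container; for example (assuming the hadoop container from above is running):

```shell
# environment variables injected for the link (the alias is uppercased)
docker run --rm --link hadoop:had00p busybox env | grep HAD00P

# the hosts entry Docker adds, pointing at the hadoop container's IP
docker run --rm --link hadoop:had00p busybox grep had00p /etc/hosts
```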