I have a Python client launching a subprocess in C++.
The C++ program runs several threads that need to report results to the Python client.
Given that both the Python client and the C++ subprocess run on the same machine, what is the best way to communicate between them: TCP, or files?
By communication through files I mean that the C++ side would write its results to several different JSON or XML files that the Python client would look for and parse.
Is it bad design to communicate through files? Is using TCP faster? What if my computer has a solid-state drive?
EDIT: I ended up using pipes (stdin, stdout). See this post: https://stackoverflow.com/questions/36748829/piping-binary-data-between-python-and-c
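For illustration, here is a minimal sketch of the pipe approach on the Python side: the client launches the worker and exchanges newline-delimited JSON over stdin/stdout. A tiny inline Python script stands in for the real C++ binary here, just so the sketch is self-contained; the real program would be launched the same way.

```python
import json
import subprocess
import sys

# Stand-in for the C++ worker: reads one JSON request from stdin and
# writes one JSON result to stdout. In practice this would be the
# compiled C++ binary, e.g. ["./worker"].
worker_code = (
    "import sys, json\n"
    "req = json.loads(sys.stdin.readline())\n"
    "print(json.dumps({'task': req['task'], 'status': 'done'}))\n"
)

proc = subprocess.Popen(
    [sys.executable, "-c", worker_code],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

# One JSON object per line keeps message framing trivial on both sides.
proc.stdin.write(json.dumps({"task": "compute"}) + "\n")
proc.stdin.flush()

result = json.loads(proc.stdout.readline())
proc.stdin.close()
proc.wait()
```

Newline-delimited JSON is convenient because both sides can use plain line-buffered I/O; for binary payloads you would need explicit length-prefixed framing instead (see the linked post).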
Best Answer
What you are trying to do is define a method for IPC, or inter-process communication. There are many, many ways to do this.
In general, the best methods for IPC provide the following benefits:

- They work whether the two processes are on the same machine or on different machines.
- They are language- and platform-agnostic, so either side can be rewritten without touching the other.
- They are well supported by standard libraries, with no ad-hoc file locking or polling to get wrong.

For these reasons, I typically pick TCP/IP.
As for the payload of the TCP/IP communication, that can be anything you want: XML, JSON, serialized objects, whatever is convenient. When mixing languages, however, I would typically go with something like XML or JSON, which are platform-agnostic and human-readable (which aids debugging).
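As a sketch of that approach, the following self-contained example sends newline-delimited JSON over a local TCP connection. A server thread stands in for the C++ side (each message representing one worker thread's result); the port number and message fields are made up for illustration.

```python
import json
import socket
import threading

# Bind to port 0 so the OS picks a free port; the real C++ server would
# listen on a port agreed upon with the client.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

def worker():
    """Stand-in for the C++ side: send one JSON result per line."""
    conn, _ = server.accept()
    with conn:
        for i in range(3):
            msg = json.dumps({"thread": i, "result": i * i}) + "\n"
            conn.sendall(msg.encode())

threading.Thread(target=worker, daemon=True).start()

# Python client: connect, then read and parse one JSON result per line.
client = socket.create_connection(("127.0.0.1", port))
results = []
with client, client.makefile("r") as stream:
    for line in stream:
        results.append(json.loads(line))
server.close()
```

On the C++ side the same framing (one JSON document per line) can be produced with any socket library plus a JSON serializer, which keeps the wire format trivial to debug with tools like `netcat`.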
I also highly recommend writing test cases where you can plug your stream into a mock that produces or consumes data. That way you can test your interface on both ends without needing a full client/server system up and running.
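One way to do that, sketched below under the assumption that the parsing logic accepts any file-like stream: in production the stream would come from `socket.makefile()`, while tests feed it an in-memory buffer with canned data.

```python
import io
import json

def read_results(stream):
    """Parse newline-delimited JSON results from any file-like object."""
    return [json.loads(line) for line in stream if line.strip()]

# Mock stream: a StringIO with canned data, no server required.
fake_stream = io.StringIO('{"id": 1, "ok": true}\n{"id": 2, "ok": false}\n')
results = read_results(fake_stream)
```

Because `read_results` never touches a socket directly, the same function can be exercised end-to-end in unit tests and then pointed at the live connection unchanged.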
Do not communicate using files unless there is literally no other way. To keep this answer short and to the point: I have worked with file-driven interfaces, I hated it every time, and I always asked if there was a different way to do it. File-based communication is error-prone and clunky.