Data exchange between python script and bash script
walters.justin01 at gmail.com
Tue Apr 4 13:58:36 EDT 2017
On Tue, Apr 4, 2017 at 10:39 AM, Venkatachalam Srinivasan <
venkatachalam.19 at gmail.com> wrote:
> Thanks for the answer. I need bash for connecting the data exchange between
> two Python scripts. To be more specific, data from one script has to be
> passed to another script. You are right, I don't need the data in
> bash other than to pass it along to the second script, which uses
> it for further analysis.
> Regarding using a JSON file, I am new to this. A naive question: if the
> data is too large, is a JSON file still easy to handle? Isn't a JSON file
> computationally expensive for large data? I am not using a text file
> for the same reason.
> Venkatachalam Srinivasan
It could be expensive to build a dictionary from the JSON file, depending
on the amount of data.
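For what it's worth, the JSON hand-off itself is just a dump and a load from the standard library; a minimal sketch (the filename and data are made up for illustration) would be:

```python
import json

# First script: serialize its dictionary to a file.
data = {"sample_1": [1.2, 3.4], "sample_2": [5.6, 7.8]}
with open("exchange.json", "w") as f:
    json.dump(data, f)

# Second script (same process here for brevity): read it back.
with open("exchange.json") as f:
    received = json.load(f)

print(received == data)  # the round trip preserves the dictionary
```

The cost the OP worries about is the parse on load, which grows with file size; for very large data that is where this approach starts to hurt.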
Alternatively, you could use a Unix socket to transmit the data between
them if you're on a Linux or macOS machine. This is probably the best
option, as you can send the dictionary object itself, which would
eliminate the need for the bash script entirely. The first script builds
the dictionary object and sends it to the socket; the second script
listens on the socket and receives it. Python's standard library has
built-in support for Unix sockets, which means you can run two parallel
Python instances and communicate between them. If the first script can
build dictionaries faster than the second script finishes processing
them, you will probably need some kind of task queue. Celery:
http://www.celeryproject.org/ tends to be a good solution for this
problem. In fact, Celery is probably the simplest option now that I
think about it.
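The socket approach might look like the following sketch. Both ends run in one process here (the receiver in a thread) so it's self-contained; in practice each half would live in its own script. The socket path and data are assumptions for illustration, and the dictionary is serialized as JSON for transport:

```python
import json
import os
import socket
import threading

SOCK_PATH = "/tmp/exchange.sock"  # assumed path; any writable location works
if os.path.exists(SOCK_PATH):
    os.unlink(SOCK_PATH)

ready = threading.Event()
received = {}

def receiver():
    # Second script: listen on the Unix socket and receive the dictionary.
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as srv:
        srv.bind(SOCK_PATH)
        srv.listen(1)
        ready.set()  # signal that the server is accepting connections
        conn, _ = srv.accept()
        with conn:
            payload = b""
            while chunk := conn.recv(4096):
                payload += chunk
            received.update(json.loads(payload))

t = threading.Thread(target=receiver)
t.start()
ready.wait()

# First script: build the dictionary and send it to the socket.
data = {"sample_1": [1.2, 3.4], "sample_2": [5.6, 7.8]}
with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as cli:
    cli.connect(SOCK_PATH)
    cli.sendall(json.dumps(data).encode())

t.join()
os.unlink(SOCK_PATH)
print(received == data)
```

Closing the client socket signals end-of-stream, so the receiver's `recv` loop terminates cleanly; a long-running pair of scripts would instead frame each message (e.g. with a length prefix).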
Another alternative is a SQLite database, which Python also has built-in
support for. It should be a bit faster and less memory-intensive than
parsing a large JSON file.
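A sketch of the SQLite route, using the standard-library sqlite3 module (an in-memory database here to keep it self-contained; two real scripts would share a database file, and the table name and data are made up):

```python
import sqlite3

# First script: store each key/value pair of the dictionary as a row.
conn = sqlite3.connect(":memory:")  # use a filename to share between scripts
conn.execute("CREATE TABLE results (key TEXT PRIMARY KEY, value REAL)")

data = {"sample_1": 1.2, "sample_2": 3.4}
conn.executemany("INSERT INTO results VALUES (?, ?)", data.items())
conn.commit()

# Second script: query the rows back for further analysis.
received = dict(conn.execute("SELECT key, value FROM results"))
conn.close()
print(received == data)
```

Unlike the JSON file, the second script can query only the rows it needs instead of loading everything into memory at once.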