Data exchange between Python script and bash script

Venkatachalam Srinivasan venkatachalam.19 at gmail.com
Wed Apr 19 17:00:29 EDT 2017


On Tuesday, April 4, 2017 at 7:58:56 PM UTC+2, justin walters wrote:
> On Tue, Apr 4, 2017 at 10:39 AM, Venkatachalam Srinivasan <
> venkatachalam.19 at gmail.com> wrote:
> 
> > Hi,
> >
> > Thanks for the answer. I need bash to connect the two Python scripts: data
> > from one script has to be passed to the other script. You are right, I
> > don't need the data in bash itself, other than to pass it on to the second
> > script, which uses it for further analysis.
> >
> > Regarding the JSON file, I am new to this. A naive question: if the data
> > is too large, is a JSON file still easy to handle? Isn't a JSON file
> > computationally expensive for large data? I am not using a text file for
> > that same reason.
> >
> > Thanks,
> > Venkatachalam Srinivasan
> >
> 
> It could be expensive to create a dictionary from the JSON file, depending
> on the amount of data.
> 
> Alternatively, you could use a Unix socket to transmit the data between
> processes if you're on a Linux or macOS machine. This is probably the best
> option, as you can send the dictionary object itself, which would eliminate
> the need for the bash script entirely. The first script builds the
> dictionary object and sends it to the socket; the second script listens on
> the socket and receives it. Python's standard library has built-in support
> for Unix sockets, so you can run two Python instances in parallel and
> communicate between them. If the first script can build dictionaries
> faster than the second script can process them, you will probably need
> some kind of task queue. Celery (http://www.celeryproject.org/) tends to
> be a good solution for this problem. In fact, Celery is probably the
> simplest option now that I think about it.
> 
> Another alternative is a SQLite database, which Python also has built-in
> support for. That should be a bit faster and less memory-intensive.
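A minimal sketch of the Unix-socket approach described above, using only the standard library. The socket path and the sample dictionary are made up for illustration, and a thread stands in for the second script so the exchange can run in one file; in practice each half would be its own process. Pickle is used here as one way to send "the dictionary object itself":

```python
import os
import pickle
import socket
import tempfile
import threading

# Hypothetical socket path; both scripts would agree on it in advance.
SOCK_PATH = os.path.join(tempfile.mkdtemp(), "exchange.sock")

# "Second script": bind and listen on the Unix socket.
server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
server.bind(SOCK_PATH)
server.listen(1)

received = {}

def consumer():
    # Accept one connection and read until the sender closes it.
    conn, _ = server.accept()
    with conn:
        chunks = []
        while True:
            chunk = conn.recv(4096)
            if not chunk:
                break
            chunks.append(chunk)
    # Unpickle the bytes back into the original dictionary.
    received.update(pickle.loads(b"".join(chunks)))

listener = threading.Thread(target=consumer)
listener.start()

# "First script": build the dictionary and send it through the socket.
with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as client:
    client.connect(SOCK_PATH)
    client.sendall(pickle.dumps({"sample": [1, 2, 3], "label": "demo"}))

listener.join()
server.close()
print(received)  # {'sample': [1, 2, 3], 'label': 'demo'}
```

Note that pickle should only be used between trusted processes; JSON over the same socket works just as well for plain dictionaries.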

Hi,

Thank you for the clearer explanation. I tried SQLite and it now seems to work fine. We still have to test it against the real-time system, where the amount of data may pose some problems (I guess). For now, everything is fine. Thank you.
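For completeness, here is a minimal sketch of the SQLite hand-off that worked: the first script writes its results into a shared database file and the second script reads them back. The file path, table name, and sample rows are assumptions for illustration:

```python
import os
import sqlite3
import tempfile

# Hypothetical shared database file; both scripts would use the same path.
DB_PATH = os.path.join(tempfile.mkdtemp(), "exchange.db")

def write_results(rows):
    # First script: store its output in a shared SQLite table.
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS results (key TEXT, value REAL)")
        conn.executemany("INSERT INTO results VALUES (?, ?)", rows)

def read_results():
    # Second script: read the rows back for further analysis.
    with sqlite3.connect(DB_PATH) as conn:
        return conn.execute(
            "SELECT key, value FROM results ORDER BY key"
        ).fetchall()

write_results([("a", 1.0), ("b", 2.5)])
print(read_results())  # [('a', 1.0), ('b', 2.5)]
```

The `with sqlite3.connect(...)` block commits the transaction on success, so the reader always sees complete writes.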


More information about the Python-list mailing list