Fast lookup of bulky "table"
rbowman
bowman at montana.com
Sun Jan 15 15:54:18 EST 2023
On Sun, 15 Jan 2023 08:27:29 -0500, Dino wrote:
> Do you have any idea about the speed of a SELECT query against a 100k
> rows / 300 Mb Sqlite db?
https://www.sqlite.org/speed.html
The site is old but has a number of comparisons. I have not used SQLite
with Python yet but with both C and C# I've been impressed with the speed
versus Postgres or MSSQL. One thing to watch is insertions. By default
each insertion is a transaction. There is a dramatic speed increase for
multiple insertions if you explicitly start a transaction, do the inserts,
and end the transaction.
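A minimal sketch of that pattern using Python's stdlib sqlite3 module (the table and data here are made up for illustration; the `with conn:` block wraps all the inserts in a single explicit transaction):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE points (id INTEGER PRIMARY KEY, x REAL, y REAL)")

rows = [(i, float(i), float(i) * 2) for i in range(100_000)]

# One explicit transaction around all inserts instead of an implicit
# transaction per statement; commits on success, rolls back on error.
with conn:
    conn.executemany("INSERT INTO points VALUES (?, ?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM points").fetchone()[0]
print(count)
```

The same effect can be had with explicit BEGIN/COMMIT statements in C or C#.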
My usage is a result of ESRI's dropping their C++/C# Engine API. My new
approach uses queries against the ArcGIS Server REST interface for much of
the functionality, but some spatial queries can be replaced with
predetermined tabular data rather than runtime spatial queries. For
example, for a given dataset you can determine the coordinates of every
intersection beforehand, and the intersection of PINE ST and MAPLE AVE
becomes a simple search in the SQLite database.
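As a hypothetical sketch of that lookup table (schema, street names, and coordinates are all invented for illustration), the street pair is stored in sorted order so the query works regardless of which street is named first:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE intersections (
        street_a TEXT,
        street_b TEXT,
        lon REAL,
        lat REAL,
        PRIMARY KEY (street_a, street_b)
    )
""")
with conn:
    # Coordinates precomputed once from the spatial dataset (made up here).
    conn.execute(
        "INSERT INTO intersections VALUES (?, ?, ?, ?)",
        ("MAPLE AVE", "PINE ST", -114.0, 46.9),
    )

# Normalize the pair to sorted order before looking it up.
a, b = sorted(["PINE ST", "MAPLE AVE"])
row = conn.execute(
    "SELECT lon, lat FROM intersections WHERE street_a = ? AND street_b = ?",
    (a, b),
).fetchone()
print(row)
```

The primary key on the street pair gives an indexed lookup, so the runtime cost is a B-tree search rather than a spatial computation.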
ESRI's ArcPy is the closest replacement for the legacy C++ API, so I assume
in the future I will be using it in conjunction with SQLite. The actual
geodata will still need to be in a spatially aware RDBMS like SQL Server
or PostgreSQL/PostGIS, but SQLite so far is the fastest and easiest to
implement for non-spatial data. Also, it is in the public domain, which
avoids the licensing complexities of MySQL and its derivatives for
commercial applications.