I have blogged earlier about loading Esri file geodatabase feature classes into a SQLite database using standard geoprocessing tools. This time, I’ve written a short sample that creates a SQLite database, reads the rows of a file geodatabase feature class, creates a new table, and inserts the features into it. Everything except reading the file geodatabase (I use arcpy to access the features with a cursor) is done in pure Python and SQL. The logic is quite simple: you read the schema of a geodatabase table and mirror it in a SQLite table, then insert each feature’s geometry and attributes into the right columns.
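A minimal sketch of that mirror-and-insert logic, using only the standard sqlite3 module. The field list and sample rows below are stand-ins for what an arcpy SearchCursor would return from a real feature class; the table name, field names, and WKT geometries are illustrative assumptions, not taken from my actual script.

```python
import sqlite3

# Illustrative schema mirrored from a feature class; in the real script
# this would be read from the geodatabase table's field definitions.
fields = [("OBJECTID", "INTEGER"), ("NAME", "TEXT"), ("SHAPE", "TEXT")]

conn = sqlite3.connect("sample.sqlite")
cur = conn.cursor()

# Build the CREATE TABLE statement from the mirrored schema
cols = ", ".join("{0} {1}".format(name, dtype) for name, dtype in fields)
cur.execute("CREATE TABLE IF NOT EXISTS parcels ({0})".format(cols))

# Rows as an arcpy cursor might yield them (geometry stored as WKT here)
rows = [
    (1, "Lot A", "POINT (10 20)"),
    (2, "Lot B", "POINT (30 40)"),
]
cur.executemany("INSERT INTO parcels VALUES (?, ?, ?)", rows)
conn.commit()

# Verify the rows landed in the new table
count = cur.execute("SELECT COUNT(*) FROM parcels").fetchone()[0]
conn.close()
```

In the real workflow you would iterate over `arcpy.da.SearchCursor` instead of a hard-coded list, but the SQLite side of the logic is exactly this.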
I was pleased to observe excellent performance for both data insertion and query processing. The initial data load is, of course, an overhead that will always be present in the script, but in my opinion the efficiency of data manipulation with SQL and spatial analysis with ST_Geometry functions outweighs it. If you work with larger datasets, you should consider tuning the SQLite spatial indices, because ST_Geometry operations may be slow at first.
It is also worth noting that with some minor modifications, it would be possible to load a non-spatial file geodatabase table into a SQLite table if you want to perform some SQL-based data aggregation. A particularly handy feature of SQLite is that a database can be created entirely in memory, without writing anything to disk. This is well suited to scripts where you want to do the processing on the fly without creating any additional scratch workspaces: use the in-memory connection syntax (:memory:). I’ve run tests loading the spatial extension and some data this way, and it worked fine.
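The in-memory variant is a one-line change to the connection string. A small sketch, with the spatial-extension step left as a comment because the library name and path depend on your Esri install:

```python
import sqlite3

# The whole database lives in RAM; nothing is written to disk.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# To use ST_Geometry here you would call conn.enable_load_extension(True)
# and load Esri's spatial extension library (its path varies by install);
# that step is omitted in this plain-Python sketch.
cur.execute("CREATE TABLE scratch (id INTEGER, value TEXT)")
cur.execute("INSERT INTO scratch VALUES (1, 'temp')")

result = cur.execute("SELECT value FROM scratch WHERE id = 1").fetchone()[0]
conn.close()  # the database vanishes when the connection closes
```

Because the database disappears when the connection closes, nothing needs to be cleaned up afterwards, which is exactly what you want in a scratch-workspace scenario.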