Anyways... one of the tasks I'm trying to complete before the start of the next Hurricane season is to rewrite my data collection suite of C programs as a nice 'simple' Python package. Since my C programs depend on the flat-files (which, as I already mentioned, can be out of date), I want to remove them... well, I want to remove the hashtable step. Currently, any of my programs that depend on the flat-files has to first read the file and convert it into a big hashtable. This is great when it comes to accessing the data, but when the program has to do this every time it runs, and considering that it runs several times an hour... well, you can guess what I'm getting at. So what I want to do is access the database, since this is what databases are designed for... but I don't want to depend on the network connection being alive while I'm developing my Python package, so in steps Sqlite3. I know I could install Postgres and copy the desired tables to it, but that seems a bit overkill when all I want is a single table that has only a few thousand entries.
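Just to sketch what I mean (the file name, table, and column names below are made up for illustration, not my real schema): instead of re-reading the flat-file and rebuilding a hashtable on every run, the program would simply ask the local Sqlite copy for the rows it needs.

import sqlite3

# Hypothetical names: storms.db / storm_fixes are placeholders, not the real schema.
conn = sqlite3.connect("storms.db")
conn.row_factory = sqlite3.Row

def fixes_for_storm(storm_id):
    # Let the database do the filtering instead of a full file scan + dict build.
    cur = conn.execute(
        "SELECT obs_time, lat, lon, max_wind FROM storm_fixes WHERE storm_id = ?",
        (storm_id,),
    )
    return cur.fetchall()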
The minute I started development I was already thinking Sqlite, but initially I was thinking of dumping my csv files into the Sqlite database... problem: my csv files are not 'simple'. So the solution.... PG_DUMP! That is right... while I was looking through the Sqlite documentation I came across this:
- PostgreSQL to SQLite3

And in the section where it discusses Postgres, it gives a quick, short set of steps to convert the data, and yes, it works...

After a little experimenting, I found a way to import data from PostgreSQL into SQLite, version 3, without having non-ASCII characters ruin things (something that getting pg_dump to export SQL statements did...). It could be enormously improved with a little scripting love, but as I only had one database with a couple of big tables to deal with, the below was fine, however... post improvements!

Export the data from your database as tab delimited text. Create one text file for every table in your database:

$ pg_dump -a -t <table> <database> > /tmp/<table> ...and so on...

Trim off crap from header and footer from each file, eg:

$ nano -w /tmp/<table>

If required, create SQLite file and tables corresponding to tables in the original PostgreSQL database:

$ sqlite3 <sqlite db file>
sqlite> CREATE TABLE ...etc...

Finally, import the dump files into SQLite, remembering to change the default import separator to a tab:

$ sqlite3 <sqlite db file>
sqlite> .separator "\t"
sqlite> .import <dump file> <table>

So now all I have to do next is learn how to interact with Sqlite from within Python.... maybe not to you, but for me, this is fun!
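As a first crack at that (and at the 'scripting love' the quoted steps invite), here's a rough sketch of doing the tab-delimited import and a sanity-check query with Python's built-in sqlite3 module; the file name, table name, and columns are placeholders I made up, not the real schema.

import csv
import sqlite3

DUMP_FILE = "/tmp/storm_fixes.txt"  # the trimmed, tab-delimited pg_dump output (hypothetical name)
DB_FILE = "storms.db"               # local Sqlite copy (hypothetical name)

conn = sqlite3.connect(DB_FILE)
conn.execute(
    "CREATE TABLE IF NOT EXISTS storm_fixes "
    "(storm_id TEXT, obs_time TEXT, lat REAL, lon REAL, max_wind REAL)"
)

# The Python equivalent of: .separator "\t" followed by .import
with open(DUMP_FILE, newline="") as fh:
    rows = csv.reader(fh, delimiter="\t")
    conn.executemany("INSERT INTO storm_fixes VALUES (?, ?, ?, ?, ?)", rows)
conn.commit()

# Quick sanity check that the data made it across.
count = conn.execute("SELECT COUNT(*) FROM storm_fixes").fetchone()[0]
print(f"imported {count} rows")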