Crates.io | pgdump2sqlite |
lib.rs | pgdump2sqlite |
version | 0.2.0 |
source | src |
created_at | 2023-11-15 01:23:18.145238 |
updated_at | 2023-11-21 00:52:46.897942 |
description | use a pgdump to create a sqlite db |
repository | https://github.com/scratchmex/pgdump2sqlite |
id | 1035609 |
size | 39,734 |
use a pg_dump to create a sqlite db
the objective is to use the dump AS IS. other solutions exist, but they require you to strip schema qualifiers like `public.table_name` from the statements.
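As a rough illustration of what that stripping step looks like (this is not pgdump2sqlite's code; the function name and the naive string replacement are assumptions for the sketch):

```rust
/// Naively drop a `schema.` qualifier (e.g. `public.`) from a dump statement.
/// Hypothetical helper for illustration only: a plain string replacement can
/// also rewrite matching text inside string literals, which a real
/// implementation would have to avoid.
fn strip_schema(statement: &str, schema: &str) -> String {
    statement.replace(&format!("{schema}."), "")
}

fn main() {
    let stmt = "CREATE TABLE public.users (id integer PRIMARY KEY);";
    assert_eq!(
        strip_schema(stmt, "public"),
        "CREATE TABLE users (id integer PRIMARY KEY);"
    );
}
```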
pgdump2sqlite pg_dump_file.<tar or sql> extracted_db.sqlite
Usage: pgdump2sqlite [OPTIONS] <PGDUMP_FILENAME> <SQLITE_FILENAME>
Arguments:
<PGDUMP_FILENAME> the file of the dump. can be .sql or .tar
<SQLITE_FILENAME>
Options:
-f delete the dst sqlite file if exists
-h, --help Print help
for me, a 16 MB tar dump with 39 tables and ~500K rows takes 0.4 seconds. I would say pretty fast.
supports:

- `create table` instruction
- `copy .. from <stdin or path>`
- plain (`.sql`) or tar dump

TODO (check the `// TODO:` comments):
- support `insert into` statements (even though this is not the default behavior and it takes much more space, so don't do it)
- parse with pest using a buffer. see pest: Support for streaming input
- get rows for the copy lazily: don't read the whole file, but use a generator (like in Python) to return each row (I don't know how to do this)
- map `f` and `t` values to `0` and `1` in the Bool dtype
- support directory, compressed tar, and custom dump types
- have test data for the tests (I only have it locally and can't upload it)
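Two of the TODOs above (reading copy rows lazily, and mapping `t`/`f` to `1`/`0`) can be sketched together. In Rust, an `Iterator` plays the role of a Python generator: each row is produced on demand, so the whole file never has to be in memory. The names below are illustrative, not the crate's actual code, and the parsing assumes pg_dump's default text COPY format (tab-separated fields, `\N` for NULL, `\.` terminator) while ignoring backslash escapes inside fields.

```rust
use std::io::{BufRead, Cursor};

/// Lazily iterate over the data rows of a COPY block: each call to `next`
/// pulls one line from the underlying reader, stopping at the `\.`
/// end-of-data marker. Fields are tab-separated; `\N` becomes `None`.
fn copy_rows<R: BufRead>(reader: R) -> impl Iterator<Item = Vec<Option<String>>> {
    reader
        .lines()
        .filter_map(Result::ok)
        .take_while(|line| line != "\\.")
        .map(|line| {
            line.split('\t')
                .map(|f| if f == "\\N" { None } else { Some(f.to_string()) })
                .collect()
        })
}

/// Map Postgres boolean literals `t`/`f` to SQLite-friendly `1`/`0`,
/// leaving any other value untouched.
fn map_bool(field: &str) -> &str {
    match field {
        "t" => "1",
        "f" => "0",
        other => other,
    }
}

fn main() {
    // A tiny in-memory stand-in for a COPY block from a dump.
    let data = Cursor::new("1\tt\n2\tf\n3\t\\N\n\\.\nignored after terminator\n");
    let rows: Vec<_> = copy_rows(data).collect();
    assert_eq!(rows.len(), 3);
    assert_eq!(rows[0], vec![Some("1".into()), Some("t".into())]);
    assert_eq!(rows[2][1], None);
    assert_eq!(map_bool("t"), "1");
    assert_eq!(map_bool("f"), "0");
}
```

Since the iterator borrows nothing beyond the reader it wraps, it can be fed straight from a `BufReader` over the dump file instead of the in-memory `Cursor` used here.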
inspired by the Scala version, postgresql-to-sqlite