Recommended Import Methods
When importing data from other systems into DuckDB, there are several considerations to take into account. We recommend the following methods, in order of preference:
- For systems that are supported by a DuckDB scanner extension, it is preferable to use the scanner. DuckDB currently offers scanners for MySQL, PostgreSQL, and SQLite.
- If there is a bulk export feature in the data source system, export the data to Parquet or CSV format, then load it using DuckDB's Parquet or CSV loader.
- If the approaches above are not applicable, consider using the DuckDB appender, currently available in the C, C++, Go, Java, and Rust APIs.
- If the data source system supports Apache Arrow and the data transfer is a recurring task, consider using the DuckDB Arrow extension.
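The first two options can be sketched in SQL as follows. This is a minimal, hedged example: the database connection string `dbname=mydb`, the source table `public.my_table`, and the file `export.parquet` are hypothetical placeholders, not names from this document.

```sql
-- Option 1: read directly from the source system via a scanner extension
-- (here, the PostgreSQL scanner; MySQL and SQLite work analogously)
INSTALL postgres;
LOAD postgres;
ATTACH 'dbname=mydb' AS pg (TYPE postgres);
CREATE TABLE my_table AS SELECT * FROM pg.public.my_table;

-- Option 2: bulk-export from the source system to Parquet or CSV,
-- then load the file with DuckDB's built-in readers
CREATE TABLE my_table_from_file AS SELECT * FROM read_parquet('export.parquet');
```

Both approaches transfer data in bulk, which is the property the recommendations above are optimizing for.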
Methods to Avoid
If possible, avoid looping row-by-row (tuple-at-a-time) in favor of bulk operations. Performing row-by-row inserts (even with prepared statements) is detrimental to performance and will result in slow load times.
Best practice: Unless your data is small (<100k rows), avoid using inserts in loops.
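To illustrate the difference, here is the row-by-row pattern to avoid next to a set-based alternative. The table `people` and file `people.csv` are hypothetical names for this sketch.

```sql
-- Avoid: one INSERT per row, issued from a client-side loop
-- INSERT INTO people VALUES ('Alice', 1);
-- INSERT INTO people VALUES ('Bob', 2);
-- ... (one round-trip and one transaction per row)

-- Prefer: a single bulk load
COPY people FROM 'people.csv';

-- or a single set-based statement
INSERT INTO people SELECT * FROM read_csv('people.csv');
```

A single bulk statement lets DuckDB parallelize parsing and write data in large batches, instead of paying per-row overhead.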
© 2025 DuckDB Foundation, Amsterdam NL