# Table to CSV

A Rust command-line tool that converts SQL database dumps to CSV files. Table to CSV parses SQL files containing `CREATE TABLE` and `INSERT` statements and extracts the data into separate CSV files for each table.
## Features

- **SQL Parsing**: Automatically detects table schemas from `CREATE TABLE` statements
- **Data Extraction**: Extracts data from `INSERT` statements and converts it to CSV format
- **Multiple Tables**: Handles databases with multiple tables, creating a separate CSV file for each
- **Date Filtering**: Filter rows by date range using the `--date-filter` option
- **Parallel Processing**: Fast multi-threaded CSV generation using Rayon
- **Robust Value Parsing**: Properly handles quoted strings, escaped characters, and SQL functions like `replace()`
- **Error Handling**: Comprehensive error handling with helpful error messages
- **Cross-Platform**: Built with Rust for excellent performance and cross-platform compatibility
## Installation

### Prerequisites

- Rust (latest stable version)

### Install from crates.io

The easiest way to install Table to CSV is via cargo:
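```shell
cargo install table-to-csv
```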
This will install the `table-to-csv` binary to your Cargo bin directory (typically `~/.cargo/bin/`).
### Building from Source

1. Clone or download this repository
2. Navigate to the project directory
3. Build the project:
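```shell
cargo build --release
```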
The executable will be created at `target/release/table-to-csv`.
## Usage

### Basic Usage
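A minimal invocation, assuming the SQL dump path is passed as the only positional argument (the original snippet is not preserved in this README):

```shell
table-to-csv database.sql
```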
### With Date Filtering
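A sketch of the `--date-filter` option; the argument order shown here (column name, start date, optional end date) is an assumption, as the original snippet is not preserved:

```shell
table-to-csv database.sql --date-filter createdAt 2024-01-01 2024-12-31
```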
Or if built from source:
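```shell
# assuming the SQL file is the sole positional argument
./target/release/table-to-csv database.sql
```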
Or using Cargo (when developing):
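```shell
# assuming the SQL file is the sole positional argument
cargo run --release -- database.sql
```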
## Examples

The following scenarios are covered:

- Using the installed binary (after `cargo install table-to-csv`)
- Converting a database dump to CSV files with Cargo (when developing)
- Using the compiled binary from source
- Filtering by date range (column name: `createdAt`, from 2024-01-01 to 2024-12-31)
- Filtering from a start date to today (the end date defaults to today)
- Using Cargo with a date filter
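For illustration, the commands for these scenarios might look like the following; the positional argument and the `--date-filter` argument order (column name, start date, optional end date) are assumptions, as the original snippets are not preserved:

```shell
# Using the installed binary (after cargo install table-to-csv)
table-to-csv database.sql

# Using the compiled binary from source
./target/release/table-to-csv database.sql

# Filter by date range (column name: createdAt, from 2024-01-01 to 2024-12-31)
table-to-csv database.sql --date-filter createdAt 2024-01-01 2024-12-31

# Filter from a start date to today (end date defaults to today)
table-to-csv database.sql --date-filter createdAt 2024-01-01

# Using Cargo with a date filter (when developing)
cargo run --release -- database.sql --date-filter createdAt 2024-01-01 2024-12-31
```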
**Date Format**: `YYYY-MM-DD`

**Note**: If `end_date` is not provided, it defaults to today's date.
## How It Works

- **Schema Detection**: Parses `CREATE TABLE` statements to extract table names and column definitions
- **Data Extraction**: Finds `INSERT` statements for each table and extracts the values
- **Value Processing**: Handles SQL-specific formatting, including:
  - Quoted strings (single and double quotes)
  - Escaped characters (`''` for single quotes, `""` for double quotes)
  - SQL functions like `replace()` for JSON data
- **Date Filtering** (optional): Filters rows whose date column values fall within the specified date range
- **Parallel Processing**: Uses Rayon to process multiple tables concurrently for better performance
- **CSV Generation**: Creates properly formatted CSV files with headers and data
## Example

Given a SQL file like `test.sql`:

```sql
CREATE TABLE users(
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    email TEXT UNIQUE
);
INSERT INTO users VALUES(1, 'Alice Smith', 'alice@example.com');
INSERT INTO users VALUES(2, 'Bob Johnson', 'bob@example.com');
```

Table to CSV will generate `users.csv`:

```csv
id,name,email
1,Alice Smith,alice@example.com
2,Bob Johnson,bob@example.com
```
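Running the tool on this file, assuming the SQL file path is passed as the positional argument:

```shell
table-to-csv test.sql
```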
## Supported SQL Features

- `CREATE TABLE` statements with various column types
- `INSERT INTO ... VALUES` statements
- Single- and double-quoted string values
- Escaped quotes in string values
- SQL `replace()` function calls
- Multi-line table definitions
- Foreign key constraints (ignored during parsing)
- Date/timestamp columns for filtering (supports various date formats)
## Dependencies

- `csv` - CSV file reading and writing
- `regex` - Regular expression pattern matching
- `anyhow` - Error handling
- `rayon` - Parallel processing for improved performance
- `chrono` - Date and time parsing for date filtering
## Testing

Run the test suite:
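```shell
cargo test
```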
The project includes unit tests for:
- Table column parsing
- Value cleaning and unescaping
- CSV value parsing
## Error Handling
The tool provides clear error messages for common issues:
- Missing command-line arguments
- File not found
- SQL parsing errors
- CSV writing errors
- Invalid date formats
- Date filter column not found
## License
This project is open source. Please check the license file for details.
## Contributing
Contributions are welcome! Please feel free to submit issues and pull requests.
## Example Output

When processing a SQL file, Table to CSV will output:

```
Processing SQL file: database.sql
Found table: CallLogs with 9 columns
Found table: CallSession with 9 columns
Found table: d1_migrations with 3 columns
Created calllogs.csv with 35 rows
Created callsession.csv with 89 rows
Created d1_migrations.csv with 3 rows
Conversion complete!

Generated CSV files:
- calllogs.csv
- callsession.csv
- d1_migrations.csv
```

To view the CSV files, you can use:

```shell
cat calllogs.csv | head -5
cat callsession.csv | head -5
```

Or open them in a spreadsheet application.
## Example Output with Date Filtering

When using the date filter feature:

```
Date filter enabled:
Column: createdAt
Start date: 2024-01-01
End date: 2024-12-31
Processing SQL file: database.sql
Found table: CallLogs with 9 columns
Found table: CallSession with 9 columns
Found table: d1_migrations with 3 columns
Created calllogs.csv with 15 rows
Created callsession.csv with 42 rows
Warning: No rows remain for table 'd1_migrations' after filtering - skipping
Conversion complete!

Generated CSV files:
- calllogs.csv
- callsession.csv
```

To view the CSV files, you can use:

```shell
cat calllogs.csv | head -5
cat callsession.csv | head -5
```

Or open them in a spreadsheet application.