Crate arrow_odbc


Fill Apache Arrow arrays from ODBC data sources.

Usage

use arrow_odbc::{odbc_api::{Environment, ConnectionOptions}, OdbcReader};

const CONNECTION_STRING: &str = "\
    Driver={ODBC Driver 17 for SQL Server};\
    Server=localhost;\
    UID=SA;\
    PWD=My@Test@Password1;\
";

fn main() -> Result<(), anyhow::Error> {
    // Your application is fine if you spin up only one Environment.
    let odbc_environment = Environment::new()?;
     
    // Connect to the database.
    let connection = odbc_environment.connect_with_connection_string(
        CONNECTION_STRING,
        ConnectionOptions::default()
    )?;

    // This SQL statement does not require any arguments.
    let parameters = ();

    // Execute query and create result set
    let cursor = connection
        .execute("SELECT * FROM MyTable", parameters)?
        .expect("SELECT statement must produce a cursor");

    // Each batch shall consist of at most 10,000 rows.
    let max_batch_size = 10_000;

    // Read result set as arrow batches. Infer Arrow types automatically using the meta
    // information of `cursor`.
    let arrow_record_batches = OdbcReader::new(cursor, max_batch_size)?;

    for batch in arrow_record_batches {
        // ... process batch ...
    }

    Ok(())
}


Structs

  • Allows setting limits for buffers bound to the ODBC data source. Check this out if you run into memory allocation or zero-sized column errors. Used when constructing a reader via crate::OdbcReader::with.
  • Arrow ODBC reader. Implements the [arrow::record_batch::RecordBatchReader] trait so it can be used to fill Arrow arrays from an ODBC data source.
  • Inserts batches from an [arrow::record_batch::RecordBatchReader] into a database.
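The buffer limits mentioned above are applied when constructing the reader with crate::OdbcReader::with instead of OdbcReader::new. A minimal sketch, assuming `cursor` and `max_batch_size` are set up as in the usage example (the size limits themselves are illustrative values, not recommendations):

```rust
use arrow_odbc::{BufferAllocationOptions, OdbcReader};

// `cursor` and `max_batch_size` obtained as in the usage example above.
let reader = OdbcReader::with(
    cursor,
    max_batch_size,
    // `None`: infer the Arrow schema from the ODBC metadata, as `new` would.
    None,
    BufferAllocationOptions {
        // Cap each text value at 4096 bytes instead of trusting the
        // (possibly huge or zero) column size reported by the driver.
        max_text_size: Some(4096),
        // Same idea for variable-length binary columns.
        max_binary_size: Some(1024),
        // Keep the remaining options at their defaults.
        ..BufferAllocationOptions::default()
    },
)?;
```

Capping the per-value sizes bounds the memory of each batch to roughly `max_batch_size` times the summed per-row limits, which is what resolves the allocation and zero-sized-column errors the struct description refers to.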

Functions

  • Query the metadata to create an Arrow schema. This function is invoked automatically for you by crate::OdbcReader::new. You may want to call it in situations where you want to create an Arrow schema without creating the reader yet.
  • Fastest and most convenient way to stream the contents of Arrow record batches into a database table. For use cases where you want to insert repeatedly into the same table from different streams, it is more efficient to create an instance of self::OdbcWriter and reuse it.
  • Creates an SQL insert statement from an Arrow schema. The resulting statement will have one placeholder (?) for each column in the statement.
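The statement generation is pure string construction, so it can be sketched in isolation. A minimal example, assuming insert_statement_from_schema takes the schema and a target table name (the `MyTable` name and the columns are illustrative; the exact identifier quoting in the generated SQL may differ by version):

```rust
use arrow::datatypes::{DataType, Field, Schema};
use arrow_odbc::insert_statement_from_schema;

// A two-column schema standing in for the table we want to insert into.
let schema = Schema::new(vec![
    Field::new("id", DataType::Int64, false),
    Field::new("name", DataType::Utf8, true),
]);

// One `?` placeholder per column, e.g. something along the lines of
// INSERT INTO MyTable (id, name) VALUES (?, ?)
let sql = insert_statement_from_schema(&schema, "MyTable");
```

The generated statement is what self::OdbcWriter binds its parameter buffers against, which is why the placeholder count must match the column count of the schema.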