Struct odbc_api::ColumnarBulkInserter
pub struct ColumnarBulkInserter<S, C> { /* private fields */ }
Can be used to execute a statement with bulk array parameters. Contrary to its name, any statement
with parameters can be executed, not only INSERT; however, inserting large amounts of data in
batches is the primary intended use case.
Binding new buffers is quite expensive in ODBC, so the parameter buffers are reused for each batch (the pointers bound to the statement stay valid). Rather than binding user-defined buffers, each batch of data is copied into the buffers that are already bound. Often the data needs to be transformed anyway, so the copy adds no real overhead. Once the buffers are filled with a batch, the data is sent.
Implementations§
impl<S, C> ColumnarBulkInserter<S, C>
where
    S: AsStatementRef,
pub unsafe fn new(statement: S, parameters: Vec<C>) -> Result<Self, Error>
where
    C: ColumnBuffer + HasDataType,
Users are not encouraged to call this directly.
Safety
- Statement is expected to be a prepared statement.
- Parameters must all be valid for insertion. An example of an invalid parameter would be a text buffer with a cell whose indicator value exceeds the maximum element length. This can happen when truncation occurs while writing into the buffer.
Examples found in repository
pub unsafe fn unchecked_bind_columnar_array_parameters<C>(
self,
parameter_buffers: Vec<C>,
) -> Result<ColumnarBulkInserter<S, C>, Error>
where
C: ColumnBuffer + HasDataType,
{
// We know that statement is a prepared statement.
ColumnarBulkInserter::new(self.into_statement(), parameter_buffers)
}
/// Use this to insert rows of string input into the database.
///
/// ```
/// use odbc_api::{Connection, Error};
///
/// fn insert_text<'e>(connection: Connection<'e>) -> Result<(), Error>{
/// // Insert six rows of text with two columns each into the database in batches of 3. In a
/// // real use case you are likely to achieve better results with a higher batch size.
///
/// // Note the two `?` used as placeholders for the parameters.
/// let prepared = connection.prepare("INSERT INTO NationalDrink (country, drink) VALUES (?, ?)")?;
/// // We assume both parameter inputs never exceed 50 bytes.
/// let mut prebound = prepared.into_text_inserter(3, [50, 50])?;
///
/// // A cell is an option of a byte slice. We could use `None` to represent NULL, but we have no
/// // need to do that in this example.
/// let as_cell = |s: &'static str| { Some(s.as_bytes()) } ;
///
/// // First batch of values
/// prebound.append(["England", "Tea"].into_iter().map(as_cell))?;
/// prebound.append(["Germany", "Beer"].into_iter().map(as_cell))?;
/// prebound.append(["Russia", "Vodka"].into_iter().map(as_cell))?;
///
/// // Execute statement using values bound in buffer.
/// prebound.execute()?;
/// // Clear buffer contents, otherwise the previous values would stay in the buffer.
/// prebound.clear();
///
/// // Second batch of values
/// prebound.append(["India", "Tea"].into_iter().map(as_cell))?;
/// prebound.append(["France", "Wine"].into_iter().map(as_cell))?;
/// prebound.append(["USA", "Cola"].into_iter().map(as_cell))?;
///
/// // Send second batch to the database
/// prebound.execute()?;
///
/// Ok(())
/// }
/// ```
pub fn into_text_inserter(
self,
capacity: usize,
max_str_len: impl IntoIterator<Item = usize>,
) -> Result<ColumnarBulkInserter<S, TextColumn<u8>>, Error> {
let max_str_len = max_str_len.into_iter();
let parameter_buffers = max_str_len
.map(|max_str_len| TextColumn::new(capacity, max_str_len))
.collect();
// Text Columns are created with NULL as default, which is valid for insertion.
unsafe { self.unchecked_bind_columnar_array_parameters(parameter_buffers) }
}
/// A [`crate::ColumnarBulkInserter`] which takes ownership of both the statement and the bound
/// array parameter buffers.
///
/// ```no_run
/// use odbc_api::{Connection, Error, IntoParameter, buffers::BufferDesc};
///
/// fn insert_birth_years(
/// conn: &Connection,
/// names: &[&str],
/// years: &[i16]
/// ) -> Result<(), Error> {
/// // All columns must have equal length.
/// assert_eq!(names.len(), years.len());
///
/// let prepared = conn.prepare("INSERT INTO Birthdays (name, year) VALUES (?, ?)")?;
///
/// // Create a columnar buffer which fits the input parameters.
/// let buffer_description = [
/// BufferDesc::Text { max_str_len: 255 },
/// BufferDesc::I16 { nullable: false },
/// ];
/// // The capacity must be able to hold at least the largest batch. We do everything in one
/// // go, so we set it to the length of the input parameters.
/// let capacity = names.len();
/// // Allocate memory for the array column parameters and bind it to the statement.
/// let mut prebound = prepared.into_column_inserter(capacity, buffer_description)?;
/// // Length of this batch
/// prebound.set_num_rows(capacity);
///
///
/// // Fill the buffer with values column by column
/// let mut col = prebound
/// .column_mut(0)
/// .as_text_view()
/// .expect("We know the name column to hold text.");
///
/// for (index, name) in names.iter().enumerate() {
/// col.set_cell(index, Some(name.as_bytes()));
/// }
///
/// let col = prebound
/// .column_mut(1)
/// .as_slice::<i16>()
/// .expect("We know the year column to hold i16.");
/// col.copy_from_slice(years);
///
/// prebound.execute()?;
/// Ok(())
/// }
/// ```
pub fn into_column_inserter(
self,
capacity: usize,
descriptions: impl IntoIterator<Item = BufferDesc>,
) -> Result<ColumnarBulkInserter<S, AnyBuffer>, Error> {
let parameter_buffers = descriptions
.into_iter()
.map(|desc| AnyBuffer::from_desc(capacity, desc))
.collect();
unsafe { self.unchecked_bind_columnar_array_parameters(parameter_buffers) }
}
/// A [`crate::ColumnarBulkInserter`] which has ownership of the bound array parameter buffers
/// and borrows the statement. For most use cases [`Self::into_any_column_inserter`] is what you
/// want to use, yet in some instances you may want to bind new parameter buffers to the same
/// prepared statement. E.g. to grow the capacity dynamically during insertions with several
/// chunks. In such use cases you may only want to borrow the prepared statement, so it can be
/// reused with a different set of parameter buffers.
pub fn column_inserter(
&mut self,
capacity: usize,
descriptions: impl IntoIterator<Item = BufferDesc>,
) -> Result<ColumnarBulkInserter<StatementRef<'_>, AnyBuffer>, Error> {
let stmt = self.statement.as_stmt_ref();
let parameter_buffers = descriptions
.into_iter()
.map(|desc| AnyBuffer::from_desc(capacity, desc))
.collect();
unsafe { ColumnarBulkInserter::new(stmt, parameter_buffers) }
}

pub fn execute(&mut self) -> Result<Option<CursorImpl<StatementRef<'_>>>, Error>
Execute the prepared statement with the bound parameters.
pub fn set_num_rows(&mut self, num_rows: usize)
Set the number of valid rows in the buffer. Must not be larger than the batch size. If the specified number exceeds the number of valid rows currently held by the buffer, the additional rows will just hold the values previously assigned to them. Therefore, when extending the number of valid rows, users should take care to assign values to these rows. However, even if not assigned, it is always guaranteed that every cell is valid for insertion and will not cause out-of-bounds access down in the ODBC driver. Therefore this method is safe. You can set the number of valid rows before or after filling values into the buffer, but you must do so before executing the query.
pub fn column_mut<'a>(&'a mut self, buffer_index: usize) -> C::SliceMut
where
    C: BoundInputSlice<'a>,
Use this method to gain write access to the actual column data.
Parameters
buffer_index: Please note that the buffer index is not identical to the ODBC column index. For one, it is zero-based. It also indexes the buffers bound, and not the columns of the output result set. This is important because not every column needs to be bound. Some columns may simply be ignored. That being said, if every column of the output is bound in the buffer, in the same order in which they are enumerated in the result set, the relationship between column index and buffer index is buffer_index = column_index - 1.
Example
This method is intended to be called when using ColumnarBulkInserter for column-wise bulk
inserts.
use odbc_api::{Connection, Error, buffers::BufferDesc};
fn insert_birth_years(conn: &Connection, names: &[&str], years: &[i16])
-> Result<(), Error>
{
// All columns must have equal length.
assert_eq!(names.len(), years.len());
// Prepare the insert statement
let prepared = conn.prepare("INSERT INTO Birthdays (name, year) VALUES (?, ?)")?;
// Create a columnar buffer which fits the input parameters.
let buffer_description = [
BufferDesc::Text { max_str_len: 255 },
BufferDesc::I16 { nullable: false },
];
// Here we do everything in one batch. So the capacity is the number of input
// parameters.
let capacity = names.len();
let mut prebound = prepared.into_column_inserter(capacity, buffer_description)?;
// Set number of input rows in the current batch.
prebound.set_num_rows(names.len());
// Fill the buffer with values column by column
// Fill names
let mut col = prebound
.column_mut(0)
.as_text_view()
.expect("We know the name column to hold text.");
for (index, name) in names.iter().map(|s| Some(s.as_bytes())).enumerate() {
col.set_cell(index, name);
}
// Fill birth years
let mut col = prebound
.column_mut(1)
.as_slice::<i16>()
.expect("We know the year column to hold i16.");
col.copy_from_slice(years);
// Execute the prepared statement with the bound array parameters, sending the values to
// the database.
prebound.execute()?;
Ok(())
}

impl<S> ColumnarBulkInserter<S, TextColumn<u8>>
pub fn append<'b>(
    &mut self,
    row: impl Iterator<Item = Option<&'b [u8]>>
) -> Result<(), Error>
where
    S: AsStatementRef,
Takes one element from the iterator for each internal column buffer and appends it to the
end of the buffer. Should a cell of the row be too large for the associated column buffer,
the column buffer will be reallocated with 1.2 times its size, and rebound to the
statement.
This method panics if one tries to insert elements beyond the batch size. It will also panic if the row does not contain at least one item for each internal column buffer.