§GOOD-ORMNING
Good-ormning is lightweight end-to-end database management with full static type checking! Do all your development in Rust (no live test database), and know that it will always work in production.
Dynamic queries are not currently supported. If you want to assemble a query programmatically you can run it against your database connection directly.
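For instance (a sketch using Sqlite via rusqlite; the table and column names are illustrative), a query string assembled at runtime can be executed directly on the connection, bypassing good-ormning's build-time checking:

```rust
use rusqlite::Connection;

fn main() -> rusqlite::Result<()> {
    let conn = Connection::open_in_memory()?;
    conn.execute("create table users (name text, points integer)", ())?;
    conn.execute(
        "insert into users (name, points) values (?1, ?2)",
        ("rust human", 0),
    )?;

    // Dynamically assembled query: good-ormning can't type-check this at
    // build time, so it runs straight on the rusqlite connection instead.
    let order_col = "points"; // chosen at runtime
    let sql = format!("select name from users order by {}", order_col);
    let mut stmt = conn.prepare(&sql)?;
    let names: Vec<String> = stmt
        .query_map((), |row| row.get(0))?
        .collect::<Result<_, _>>()?;
    println!("{:?}", names);
    Ok(())
}
```

The trade-off is explicit: anything assembled this way gets rusqlite's runtime errors rather than good-ormning's compile-time ones.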
§Example
Create this build.rs file:
    use good_ormning::sqlite::{
        Version,
        schema::field::*,
        generate,
        GenerateArgs,
    };

    fn main() {
        println!("cargo:rerun-if-changed=build.rs");
        let latest_version = Version::new();
        let users = latest_version.table("users");
        users.rowid_field(None);
        users.field("name", field_str().build());
        users.field("points", field_i64().build());
        generate(GenerateArgs {
            versions: vec![
                // Versions
                (1usize, latest_version.build()),
            ],
            ..Default::default()
        }).unwrap();
    }

generate saves the version type info in $OUT_DIR for use in the proc macros, and generates code to perform database migrations.
You can also programmatically assemble queries using ast objects and pass them in to GenerateArgs to have it turn them into functions in the created module.
Use the database with:

    use good_ormning::good_module;
    use good_ormning::sqlite::good_query;

    fn main() {
        good_module!(dbm);
        let mut db = rusqlite::Connection::open_in_memory().unwrap();
        dbm::migrate(&mut db, None).unwrap();
        good_query!("insert into users (name, points) values ($name, $points)"; dbm::Db(&mut db), name: string = "rust human", points: i64 = 0).unwrap();
        let users = good_ormning::sqlite::good_query_many!("select name, points from users"; dbm::Db(&mut db)).unwrap();
        for user in users {
            println!("User: {}, Points: {}", user.name, user.points);
        }
    }

migrate’s second parameter is a callback which is called after each migration, so you can run custom fixup code in migrations.
Output:

    User: rust human, Points: 0

§Supported databases
- PostgreSQL (feature pg) via tokio-postgres
- Sqlite (feature sqlite) via rusqlite
I think these are both mostly implemented, but if there’s a missing language feature you need, let me know and I’ll try to prioritize it!
§Getting started
§First time
- You’ll need the following runtime dependencies:
  - good-ormning
  - tokio-postgres for PostgreSQL
  - rusqlite for Sqlite

  And build.rs dependencies:
  - good-ormning

  And you must enable one (or more) of the database features:
  - pg
  - sqlite

  plus maybe chrono or jiff for DateTime support.
- Create a build.rs and define your initial schema version using Version::new().
- Call good_ormning::sqlite::generate() (or good_ormning::pg::generate()) to output the generated code.
- In your runtime code, call good_module!(dbm) to include the generated code.
- After creating a database connection, call dbm::migrate(&mut db, None).
- Make queries using good_query!().
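Assembled from the dependency list above, the Cargo.toml setup might look like this (a sketch; version numbers are deliberately left as "*" and the feature selection assumes Sqlite):

```toml
[dependencies]
# Runtime: good-ormning itself plus the driver for your database.
good-ormning = { version = "*", features = ["sqlite"] }  # and/or "pg", "chrono", "jiff"
rusqlite = "*"            # for Sqlite; tokio-postgres for PostgreSQL instead

[build-dependencies]
# build.rs also needs good-ormning in order to call generate().
good-ormning = { version = "*", features = ["sqlite"] }
```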
§Schema changes
- Copy your previous version schema, leaving the old schema version untouched. Modify the new schema as you wish.
- Pass both the old and new schema versions to generate(), which will generate the new migration statements.
- At runtime, the migrate call will make sure the database is updated to the new schema version.
You can get rid of old schema versions once you know there are no existing databases running that version.
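Following these steps, the earlier build.rs extended with a second schema version might look like this (a sketch reusing only the builder calls shown in the example above; the new email column is illustrative):

```rust
use good_ormning::sqlite::{Version, schema::field::*, generate, GenerateArgs};

fn main() {
    println!("cargo:rerun-if-changed=build.rs");

    // Version 1: the original schema, left untouched.
    let v1 = Version::new();
    let users_v1 = v1.table("users");
    users_v1.rowid_field(None);
    users_v1.field("name", field_str().build());
    users_v1.field("points", field_i64().build());

    // Version 2: a copy of version 1 plus an additional column.
    let v2 = Version::new();
    let users_v2 = v2.table("users");
    users_v2.rowid_field(None);
    users_v2.field("name", field_str().build());
    users_v2.field("points", field_i64().build());
    users_v2.field("email", field_str().build()); // illustrative new column

    generate(GenerateArgs {
        // Passing both versions makes generate() emit the 1 -> 2 migration.
        versions: vec![(1usize, v1.build()), (2usize, v2.build())],
        ..Default::default()
    }).unwrap();
}
```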
§Usage details
§good_query macros
These macros are used to execute type-checked queries against the database.
They have the format good_query_SUFFIX!([DBNAME: string,] [VERSION: usize,] SQL: string; CONN, (PARAM: TYPE = VALUE,)...)
- SUFFIX - This determines the return type.
  - No suffix: no return
  - _one: Query will always return one row, or an error
  - _maybe: Query will return one or zero rows, or an error. Returns Option<>
  - _many: Query will return any number of results. Returns Vec<>
- DBNAME - Optional. If you provided a name in build.rs to generate, use the same name here. For when you have multiple databases.
- VERSION - Optional. Which schema version to execute the query against. You should only need this when running migration post-version code in the callback in migrate().
- SQL - The literal SQL query you want to execute. This will be parsed and used to do type checking and return type generation.
- CONN - The database connection.
- PARAM: TYPE = VALUE - The parameter values and their types (because the proc macro doesn’t receive type information…). TYPE takes the format [arr] [opt] type. type can be any custom type name, or: i16, i32, i64, u32, f32, f64, bool, string, bytes, utctime_s_chrono, utctime_ms_chrono, utctime_s_jiff, utctime_ms_jiff, auto
Parameters can also be provided inline in the SQL string using ${type = value} syntax.
Example:
    good_query!("insert into users (name, points) values (${string = \"rust human\"}, ${i64 = 0})"; dbm::Db(&mut db)).unwrap();

§Features
- pg - enables generating code for PostgreSQL
- sqlite - enables generating code for Sqlite
- chrono - enables datetime field/expression types
§A few words on the future
Obviously writing an SQL VM isn’t great. The ideal solution would be for popular databases to expose their type checking routines as libraries so they could be imported into external programs, like how Go publishes reusable ast-parsing and type-checking libraries.
§Re-exports

- pub use crate::pg::GenerateArgs as PgGenerateArgs;
- pub use crate::sqlite::GenerateArgs as SqliteGenerateArgs;
§Modules

- pg
- sqlite
§Macros

- good_module - Create a module containing the generated code for a database.
§Functions

- pg_type_bool
- pg_type_bytes
- pg_type_f32
- pg_type_f64
- pg_type_i32
- pg_type_i64
- pg_type_str
- pg_type_u32
- pg_type_utctime_s_chrono
- pg_type_utctime_s_jiff
- sqlite_type_bool
- sqlite_type_bytes
- sqlite_type_f32
- sqlite_type_f64
- sqlite_type_i32
- sqlite_type_i64
- sqlite_type_str
- sqlite_type_u32
- sqlite_type_utctime_s_chrono
- sqlite_type_utctime_s_jiff