Crate secret_integers

This crate defines simple wrappers around Rust's integer types to guarantee they are used in a constant-time fashion. Hence, division and direct comparison of these "secret" integers are disallowed.

These integers are intended to be the go-to type to use when implementing cryptographic software, as they provide an extra automated check against use of variable-time operations.
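To see why such wrappers help, here is a minimal sketch of the underlying newtype pattern. This is not the crate's actual implementation; the name SecretU32 and the choice of wrapping_add are assumptions for illustration. The key idea is that only branch-free, constant-time operations are implemented on the wrapper, so variable-time ones fail to compile.

```rust
use std::ops::Add;

// Hypothetical sketch (not the crate's real code): a newtype that only
// exposes constant-time operations on the wrapped value.
#[derive(Clone, Copy)]
pub struct SecretU32(u32);

impl SecretU32 {
    // classify: wrap a public value as secret
    pub fn classify(x: u32) -> Self {
        SecretU32(x)
    }
    // declassify: explicitly extract the public value again
    pub fn declassify(self) -> u32 {
        self.0
    }
}

// Wrapping addition is branch-free, so it is safe to expose.
impl Add for SecretU32 {
    type Output = SecretU32;
    fn add(self, rhs: SecretU32) -> SecretU32 {
        SecretU32(self.0.wrapping_add(rhs.0))
    }
}

// Deliberately NO `impl Div` and NO `PartialEq`: dividing or directly
// comparing two SecretU32 values is a compile-time error.
```

Because the wrapper implements neither Div nor PartialEq, any accidental variable-time use of a secret value is caught by the type checker rather than at review time.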

To use the crate, just import everything (use secret_integers::*;) and replace your integer types with uppercase versions of their names (e.g. u8 -> U8).


In order to print information or test code involving your secret integers, you first need to declassify them. Your crypto code should not contain any occurrence of declassify, though, in order to guarantee constant-timedness. Make sure to specify the type of your literals when classifying (e.g. 0x36u16) or else you'll get a casting error.

let x = U32::classify(1u32);
let y : U32 = 2u32.into();
assert_eq!((x + y).declassify(), 3);

Using an illegal operation will get you a compile-time error:

This example deliberately fails to compile
let x = U32::classify(4u32);
let y : U32 = 2u32.into();
assert_eq!((x / y).declassify(), 2);
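When direct comparison is forbidden, constant-time code typically compares secrets by computing a mask instead of a bool. The following is a hypothetical, branch-free equality sketch on plain u32 values, not this crate's API; ct_eq_mask is a name invented here for illustration.

```rust
// Hypothetical branch-free equality: returns an all-ones mask when
// a == b and an all-zeros mask otherwise (not the crate's API).
fn ct_eq_mask(a: u32, b: u32) -> u32 {
    let x = a ^ b; // zero iff a == b
    // (x | -x) has its top bit set iff x is nonzero; shift that bit
    // down so `nonzero` is exactly 0 or 1, with no branching.
    let nonzero = (x | x.wrapping_neg()) >> 31;
    // 0 - 1 wraps to 0xFFFF_FFFF; 1 - 1 stays 0.
    nonzero.wrapping_sub(1)
}
```

The resulting mask can then be combined with other masked values, so no branch ever depends on the secret comparison.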

Since indexing arrays and vectors is only possible with usize, these secret integers also prevent you from using secret values to index memory (which would breach constant-timedness due to cache behaviour).

fn xor_block(block1: &mut [U64;16], block2: &[U64;16]) {
   for i in 0..16 {
     block1[i] ^= block2[i]
   }
}

See the Dalek and Chacha20 examples for more details on how to use this crate.
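When a lookup position really is secret, the standard workaround is to touch every entry and select the right one with masks, so the memory access pattern is independent of the secret. The sketch below shows this on plain u64 values; ct_lookup is a hypothetical helper written for this illustration, not part of the crate.

```rust
// Hypothetical constant-time table lookup: instead of indexing with a
// secret value, scan every entry and keep the match via masking.
fn ct_lookup(table: &[u64], secret_index: u64) -> u64 {
    let mut result = 0u64;
    for (i, &entry) in table.iter().enumerate() {
        // diff is zero iff i == secret_index.
        let diff = (i as u64) ^ secret_index;
        // nonzero is 1 iff diff != 0, computed without branching.
        let nonzero = (diff | diff.wrapping_neg()) >> 63;
        // mask is all ones iff this is the selected entry.
        let mask = nonzero.wrapping_sub(1);
        result |= entry & mask;
    }
    result
}
```

Every iteration performs the same reads and arithmetic, so cache timing reveals nothing about which entry was selected.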


Because stable Rust does not yet allow constant functions, it is impossible to use these wrappers in const declarations. Even classifying directly inside the declaration does not work:

This example deliberately fails to compile
const IV : [U32;2] = [U32::classify(0xbe6548u32),U32::classify(0xaec6d48u32)];

For now, the solution is to map your const items with classify once you're inside a function, or call into.

const IV : [u32;2] = [0xbe6548, 0xaec6d48];

fn start_cipher(plain: &mut Vec<U32>) {
   for i in 0..plain.len() {
     plain[i] = plain[i] | (plain[i] ^ IV[i % 2].into());
   }
}