//! A high-level, unified API for the Nova-based Proof-of-Retrievability system.
//!
//! This module provides a single, consistent interface for both single-file and multi-file proofs,
//! abstracting away the complexities of the underlying SNARK operations.
//!
//! ## Core Workflow
//!
//! 1. **`prepare_file()`**: Processes raw data into a `PreparedFile` (private to the prover)
//!    and a `FileMetadata` object (public commitment). Data is erasure coded, concatenated,
//!    chunked, and built into a Merkle tree.
//! 2. **`Challenge::new()`**: Creates a challenge object specifying the file to be proven,
//!    the number of proof iterations, a deterministic seed, and the prover ID.
//! 3. **`prove()`**: Generates a succinct `Proof` for one or more file challenges.
//!    Supports any number of files, with automatic padding to the next power of two.
//!    Challenges may have different seeds (enabling multi-batch aggregation).
//! 4. **`verify()`**: Verifies the `Proof` against the public `FileMetadata` and `Challenge`.
//!
//! ## Security Considerations
//!
//! - **Challenge Generation**: Leaf indices are derived from a hash by taking the least
//!   significant bits corresponding to the tree depth. Because the padded leaf count is a
//!   power of two, this yields a uniform, unbiased distribution of challenges across all leaves.
//! - **Private Data**: The `PreparedFile` struct contains the entire Merkle tree and must be
//!   kept private by the prover: leaking it reveals the underlying file data, defeating the
//!   privacy guarantees the proof otherwise provides.
//! - **Chunk Size**: The chunk size used by `prepare_file` (`config::CHUNK_SIZE_BYTES`) must be
//!   ≤ 31 bytes, which ensures each chunk can be represented as a single field element. This
//!   constraint is enforced by the API.
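//!
//! The index-derivation rule above can be sketched as follows. This is an illustrative
//! stand-alone snippet, not the system's actual Poseidon-based implementation;
//! `leaf_index` is a hypothetical helper named here only for the example:
//!
//! ```rust
//! // Keep the `depth` least significant bits of a challenge hash, yielding a
//! // uniform index in 0..2^depth (the number of leaves) when depth is the
//! // tree depth and the leaf count is a power of two.
//! fn leaf_index(hash_bits: u64, depth: u32) -> u64 {
//!     hash_bits & ((1u64 << depth) - 1)
//! }
//!
//! assert_eq!(leaf_index(0b1011_0110, 3), 0b110); // depth 3 => 8 leaves, index 6
//! ```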
//!
//! ## Example
//!
//! A complete example demonstrating the API workflow:
//!
//! ```rust,no_run
//! use kontor_crypto::api::{
//!     prepare_file, Challenge, FieldElement, PorSystem,
//!     tree_depth_from_metadata,
//! };
//! use kontor_crypto::FileLedger;
//!
//! // 1. Prepare the file
//! let my_data = b"This is a test file for the PoR system.";
//! let (prepared_file, metadata) = prepare_file(my_data, "test.dat").unwrap();
//!
//! // 2. Create ledger and add the file
//! let mut ledger = FileLedger::new();
//! ledger.add_file(metadata.file_id.clone(), metadata.root, tree_depth_from_metadata(&metadata)).unwrap();
//!
//! // 3. Create PorSystem and challenge
//! let system = PorSystem::new(&ledger);
//! let num_challenges = 5;
//! let seed = FieldElement::from(12345u64); // Deterministic seed
//! let challenge = Challenge::new(metadata.clone(), 1000, num_challenges, seed, String::from("node_1"));
//!
//! // 4. Generate proof using the unified API
//! let files = vec![&prepared_file];
//! let proof = system.prove(files, &[challenge.clone()]).unwrap();
//!
//! // 5. Verify the proof
//! let is_valid = system.verify(&proof, &[challenge]).unwrap();
//! assert!(is_valid, "Proof verification failed!");
//!
//! println!("Proof successfully generated and verified with Nova PoR API.");
//! ```
// Declare sub-modules
// Re-export the public API
pub use PorSystem;
// Internal modules can access these for implementation
// Export for testing - these are implementation details
pub use verify as verify_raw;
pub use generate_circuit_witness;
// Re-export key external types for easier access.
// Local imports for utility functions
use crate::build_tree;
use debug_span;
/// Processes raw data into a `PreparedFile` (private) and `FileMetadata` (public).
///
/// This function applies erasure coding, concatenates the shards, chunks the result into
/// fixed-size pieces (`config::CHUNK_SIZE_BYTES`), and builds a Merkle tree whose
/// leaves are the Poseidon commitments of those pieces.
///
/// Note: this functionality is also available as the `PorSystem::prepare_file()` method. The
/// free function is provided for cases where you need to prepare files before creating the ledger.
///
/// # Arguments
///
/// * `data` - The raw data to be processed (Reed-Solomon erasure coding is applied automatically)
/// * `filename` - Human-readable filename, used for operator UX and integration
///
/// # Returns
///
/// Returns a tuple of `(PreparedFile, FileMetadata)` where:
/// - `PreparedFile` contains the private Merkle tree for the prover
/// - `FileMetadata` contains the public commitment and reconstruction information
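///
/// # Example: chunking sketch
///
/// A minimal sketch of the chunking step alone; the real pipeline also applies
/// Reed-Solomon coding and Poseidon hashing, and its padding scheme may differ
/// (zero-padding the last chunk is an assumption made here for illustration):
///
/// ```rust
/// // Split encoded bytes into fixed 31-byte chunks (mirroring
/// // config::CHUNK_SIZE_BYTES), zero-padding the final partial chunk.
/// const CHUNK_SIZE: usize = 31;
/// let data = vec![0xABu8; 70];
/// let chunks: Vec<Vec<u8>> = data
///     .chunks(CHUNK_SIZE)
///     .map(|c| {
///         let mut padded = c.to_vec();
///         padded.resize(CHUNK_SIZE, 0);
///         padded
///     })
///     .collect();
/// assert_eq!(chunks.len(), 3); // ceil(70 / 31) = 3
/// assert!(chunks.iter().all(|c| c.len() == CHUNK_SIZE));
/// ```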
/// Computes the Merkle tree depth implied by `FileMetadata`.
///
/// Depth is defined as the number of sibling steps from a leaf to the root.
/// For `padded_len` leaves, `depth = log2(padded_len)`.
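///
/// # Example
///
/// A minimal sketch of the relationship (assuming `padded_len` is a power of two,
/// as the padding step guarantees):
///
/// ```rust
/// // For a power of two, log2(n) equals the number of trailing zero bits.
/// let padded_len: usize = 8;
/// let depth = padded_len.trailing_zeros();
/// assert_eq!(depth, 3); // 8 leaves => 3 sibling steps from leaf to root
/// ```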
/// Reconstructs the original file from erasure-coded shards.
///
/// # Arguments
///
/// * `shards` - Vector of erasure-coded symbols, where `None` represents a missing symbol
/// * `metadata` - The file metadata containing erasure config and size information
///
/// # Returns
///
/// Returns `Ok(Vec<u8>)` containing the original file data, or an error if:
/// - Too many shards are missing for reconstruction
/// - The reconstructed data is invalid
/// - The metadata is inconsistent
///
/// # Example
///
/// ```rust,no_run
/// use kontor_crypto::api::{prepare_file, reconstruct_file};
///
/// let data = b"Hello, world!";
/// let (prepared_file, metadata) = prepare_file(data, "example.dat").unwrap();
///
/// // Simulate having some symbols with some missing (for reconstruction testing)
/// // In practice, you'd get these from the prepared_file or from storage
/// // Each symbol is a 31-byte chunk of the Reed-Solomon encoded data
/// let total_symbols = metadata.total_symbols();
/// let mut symbols: Vec<Option<Vec<u8>>> = (0..total_symbols)
///     .map(|_| Some(vec![0u8; 31])) // Placeholder symbols
///     .collect();
/// symbols[0] = None; // Simulate missing first symbol
///
/// let reconstructed = reconstruct_file(&symbols, &metadata).unwrap();
/// // Note: This example uses placeholder data, so reconstructed won't match original
/// ```