pub trait StructSerializable: Sized {
// Required methods
fn to_serializer(&self) -> StructSerializer;
fn from_deserializer(
deserializer: &mut StructDeserializer,
) -> SerializationResult<Self>;
// Provided methods
fn save_json<P: AsRef<Path>>(&self, path: P) -> SerializationResult<()> { ... }
fn save_binary<P: AsRef<Path>>(&self, path: P) -> SerializationResult<()> { ... }
fn load_json<P: AsRef<Path>>(path: P) -> SerializationResult<Self> { ... }
fn load_binary<P: AsRef<Path>>(path: P) -> SerializationResult<Self> { ... }
fn to_json(&self) -> SerializationResult<String> { ... }
fn to_binary(&self) -> SerializationResult<Vec<u8>> { ... }
fn from_json(json: &str) -> SerializationResult<Self> { ... }
fn from_binary(data: &[u8]) -> SerializationResult<Self> { ... }
}
Trait for structs that can be serialized and deserialized using the struct builder pattern
This trait provides a complete serialization and deserialization interface for structs, offering both manual field-by-field control and convenient file I/O operations. It serves as a zero-dependency alternative to serde’s derive macros, allowing structs to be easily serialized and deserialized with minimal boilerplate.
§Purpose
The StructSerializable trait provides:
- Complete serialization workflow: From struct to multiple output formats
- File I/O operations: Direct save/load to JSON and binary files
- String/binary conversion: In-memory serialization and deserialization
- Type safety: Compile-time guarantees for struct serialization
- Error handling: Comprehensive error reporting for all operations
- Zero dependencies: Pure Rust implementation without external crates
§Required Methods
- to_serializer - Convert the struct to a StructSerializer
- from_deserializer - Create the struct from a StructDeserializer
§Provided Methods
- save_json - Save the struct to a JSON file
- save_binary - Save the struct to a binary file
- load_json - Load the struct from a JSON file
- load_binary - Load the struct from a binary file
- to_json - Convert the struct to a JSON string
- to_binary - Convert the struct to binary data
- from_json - Create the struct from a JSON string
- from_binary - Create the struct from binary data
§Examples
The trait provides comprehensive serialization capabilities with file I/O and format conversion.
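In sketch form, the workflow looks like the following. This is a self-contained illustration of the builder pattern only: the `StructSerializer`/`StructDeserializer` stubs, the `.field` method, and the `Config` struct are hypothetical stand-ins defined inline so the sketch compiles, not the crate's real API.

```rust
use std::collections::HashMap;

// Hypothetical minimal stand-ins for the crate's types; the real API may differ.
type SerializationResult<T> = Result<T, String>;

#[derive(Default)]
struct StructSerializer {
    fields: Vec<(String, String)>,
}

impl StructSerializer {
    // Fluent interface: each call registers one field and returns the builder.
    fn field(mut self, name: &str, value: &dyn ToString) -> Self {
        self.fields.push((name.to_string(), value.to_string()));
        self
    }
}

struct StructDeserializer {
    fields: HashMap<String, String>,
}

impl StructDeserializer {
    // Extracts a named field, reporting missing or mistyped fields by name.
    fn field<T: std::str::FromStr>(&mut self, name: &str) -> SerializationResult<T> {
        self.fields
            .get(name)
            .ok_or_else(|| format!("missing field: {name}"))?
            .parse()
            .map_err(|_| format!("type error in field: {name}"))
    }
}

struct Config {
    name: String,
    retries: u32,
}

impl Config {
    // Register every serializable field with the builder.
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::default()
            .field("name", &self.name)
            .field("retries", &self.retries)
    }

    // Extract every field and construct the struct, propagating errors.
    fn from_deserializer(d: &mut StructDeserializer) -> SerializationResult<Self> {
        Ok(Config {
            name: d.field("name")?,
            retries: d.field("retries")?,
        })
    }
}

fn main() {
    let config = Config { name: "server".to_string(), retries: 3 };

    // Serialize: the builder now holds both fields as key/value pairs.
    let serializer = config.to_serializer();
    assert_eq!(serializer.fields.len(), 2);

    // Deserialize from the same pairs and verify the round trip.
    let mut d = StructDeserializer {
        fields: serializer.fields.into_iter().collect(),
    };
    let restored = Config::from_deserializer(&mut d).unwrap();
    assert_eq!(restored.name, "server");
    assert_eq!(restored.retries, 3);
}
```

The point of the split into two required methods is that everything else (JSON, binary, file I/O) can be provided generically on top of them.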
§Implementors
Common types that implement this trait include:
- Configuration structs: Settings, config files, parameters
- Data models: Business logic entities, domain objects
- Serializable objects: Any struct requiring persistence
- Custom types: User-defined serializable structures
§Thread Safety
Implementations should be thread-safe: trait methods must not mutate shared state, so they are safe to call concurrently.
§Required Methods
fn to_serializer(&self) -> StructSerializer
Converts the struct to a StructSerializer
This method should create a StructSerializer and register all fields
that should be serialized. The implementation should use the fluent
interface to build the serializer with all relevant fields.
§Returns
A StructSerializer containing all the struct’s serializable fields
§Examples
Creates a serializer with all struct fields using the fluent interface.
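The fluent style might look like this sketch, where the minimal builder and the `Point` struct are hypothetical, defined inline only so the example compiles:

```rust
// Hypothetical minimal builder; the real StructSerializer exposes a richer API.
#[derive(Default)]
struct StructSerializer {
    fields: Vec<(String, String)>,
}

impl StructSerializer {
    fn field(mut self, name: &str, value: &dyn ToString) -> Self {
        self.fields.push((name.to_string(), value.to_string()));
        self
    }
}

struct Point {
    x: f32,
    y: f32,
}

impl Point {
    // Each serializable field is registered exactly once, in a fixed order.
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::default()
            .field("x", &self.x)
            .field("y", &self.y)
    }
}

fn main() {
    let s = Point { x: 1.5, y: -2.0 }.to_serializer();
    assert_eq!(s.fields.len(), 2);
    assert_eq!(s.fields[0].0, "x");
}
```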
fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self>
Creates the struct from a StructDeserializer
This method should extract all fields from the deserializer and construct a new instance of the struct. It should handle field extraction errors gracefully and provide meaningful error messages.
§Arguments
deserializer- The deserializer containing the struct’s field data
§Returns
Ok(Self) on successful struct construction
Err(SerializationError) if field extraction or construction fails
§Examples
Extracts fields and constructs the struct with proper error handling.
§Provided Methods
fn save_json<P: AsRef<Path>>(&self, path: P) -> SerializationResult<()>
Saves the struct to a JSON file
Serializes the struct to JSON format and writes it to the specified file path. The file is created if it doesn’t exist, or truncated if it already exists.
§Arguments
path- File path where the JSON data should be written
§Returns
Ok(()) on successful file write
Err(SerializationError) if serialization or file I/O fails
§Examples
Saves struct data to a JSON file with proper file I/O handling.
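The file-handling half of this method can be sketched with the standard library alone; `write_json_file` is a hypothetical helper, assuming the struct has already been rendered to a JSON string:

```rust
use std::fs;
use std::io;
use std::path::Path;

// Create any missing parent directories, then create or truncate the
// target file and write the serialized JSON into it.
fn write_json_file<P: AsRef<Path>>(json: &str, path: P) -> io::Result<()> {
    if let Some(parent) = path.as_ref().parent() {
        if !parent.as_os_str().is_empty() {
            fs::create_dir_all(parent)?;
        }
    }
    fs::write(path, json)
}

fn main() -> io::Result<()> {
    let path = std::env::temp_dir().join("demo_struct.json");
    write_json_file("{\"id\":1}", &path)?;
    assert_eq!(fs::read_to_string(&path)?, "{\"id\":1}");
    fs::remove_file(&path)?;
    Ok(())
}
```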
Examples found in repository:
89 pub fn save_json(&self, path: &str) -> Result<(), Box<dyn std::error::Error>> {
90 // Create directory if it doesn't exist
91 if let Some(parent) = std::path::Path::new(path).parent() {
92 fs::create_dir_all(parent)?;
93 }
94
95 let weight_path = format!("{}_weight.json", path);
96 let bias_path = format!("{}_bias.json", path);
97
98 self.weight.save_json(&weight_path)?;
99 self.bias.save_json(&bias_path)?;
100
101 println!("Saved linear layer to {} (weight and bias)", path);
102 Ok(())
103 }
More examples
204 pub fn save_json(&self, path: &str) -> Result<(), Box<dyn std::error::Error>> {
205 if let Some(parent) = std::path::Path::new(path).parent() {
206 fs::create_dir_all(parent)?;
207 }
208
209 for (i, layer) in self.layers.iter().enumerate() {
210 let layer_path = format!("{}_layer_{}", path, i);
211 let weight_path = format!("{}_weight.json", layer_path);
212 let bias_path = format!("{}_bias.json", layer_path);
213
214 layer.weight.save_json(&weight_path)?;
215 layer.bias.save_json(&bias_path)?;
216 }
217
218 println!(
219 "Saved feed-forward network to {} ({} layers)",
220 path,
221 self.layers.len()
222 );
223 Ok(())
224 }
224 fn demonstrate_user_profile_serialization() -> Result<(), Box<dyn std::error::Error>> {
225 println!("--- User Profile Serialization ---");
226
227 // Create a user profile with various field types
228 let user = UserProfile {
229 id: 12345,
230 username: "alice_cooper".to_string(),
231 email: "alice@example.com".to_string(),
232 age: 28,
233 is_active: true,
234 score: 95.7,
235 };
236
237 println!("Original user profile:");
238 println!(" ID: {}", user.id);
239 println!(" Username: {}", user.username);
240 println!(" Email: {}", user.email);
241 println!(" Age: {}", user.age);
242 println!(" Active: {}", user.is_active);
243 println!(" Score: {}", user.score);
244
245 // Serialize to JSON
246 let json_data = user.to_json()?;
247 println!("\nSerialized to JSON:");
248 println!("{}", json_data);
249
250 // Save to JSON file
251 user.save_json("temp_user_profile.json")?;
252 println!("Saved to file: temp_user_profile.json");
253
254 // Load from JSON file
255 let loaded_user = UserProfile::load_json("temp_user_profile.json")?;
256 println!("\nLoaded user profile:");
257 println!(" ID: {}", loaded_user.id);
258 println!(" Username: {}", loaded_user.username);
259 println!(" Email: {}", loaded_user.email);
260 println!(" Age: {}", loaded_user.age);
261 println!(" Active: {}", loaded_user.is_active);
262 println!(" Score: {}", loaded_user.score);
263
264 // Verify data integrity
265 assert_eq!(user, loaded_user);
266 println!("Data integrity verification: PASSED");
267
268 Ok(())
269}
270
271/// Demonstrate serialization with collections and optional fields
272fn demonstrate_app_settings_serialization() -> Result<(), Box<dyn std::error::Error>> {
273 println!("\n--- App Settings Serialization ---");
274
275 // Create app settings with collections and optional fields
276 let mut env_vars = HashMap::new();
277 env_vars.insert("LOG_LEVEL".to_string(), "info".to_string());
278 env_vars.insert("PORT".to_string(), "8080".to_string());
279 env_vars.insert("HOST".to_string(), "localhost".to_string());
280
281 let settings = AppSettings {
282 app_name: "Train Station Example".to_string(),
283 version: "1.0.0".to_string(),
284 debug_mode: true,
285 max_connections: 100,
286 timeout_seconds: 30.5,
287 features: vec![
288 "authentication".to_string(),
289 "logging".to_string(),
290 "metrics".to_string(),
291 ],
292 environment_vars: env_vars,
293 optional_database_url: Some("postgresql://localhost:5432/mydb".to_string()),
294 };
295
296 println!("Original app settings:");
297 println!(" App Name: {}", settings.app_name);
298 println!(" Version: {}", settings.version);
299 println!(" Debug Mode: {}", settings.debug_mode);
300 println!(" Max Connections: {}", settings.max_connections);
301 println!(" Timeout: {} seconds", settings.timeout_seconds);
302 println!(" Features: {:?}", settings.features);
303 println!(" Environment Variables: {:?}", settings.environment_vars);
304 println!(" Database URL: {:?}", settings.optional_database_url);
305
306 // Serialize to binary format for efficient storage
307 let binary_data = settings.to_binary()?;
308 println!("\nSerialized to binary: {} bytes", binary_data.len());
309
310 // Save to binary file
311 settings.save_binary("temp_app_settings.bin")?;
312 println!("Saved to file: temp_app_settings.bin");
313
314 // Load from binary file
315 let loaded_settings = AppSettings::load_binary("temp_app_settings.bin")?;
316 println!("\nLoaded app settings:");
317 println!(" App Name: {}", loaded_settings.app_name);
318 println!(" Version: {}", loaded_settings.version);
319 println!(" Debug Mode: {}", loaded_settings.debug_mode);
320 println!(" Features count: {}", loaded_settings.features.len());
321 println!(
322 " Environment variables count: {}",
323 loaded_settings.environment_vars.len()
324 );
325
326 // Verify data integrity
327 assert_eq!(settings, loaded_settings);
328 println!("Data integrity verification: PASSED");
329
330 Ok(())
331}
332
333/// Demonstrate format comparison between JSON and binary
334fn demonstrate_format_comparison() -> Result<(), Box<dyn std::error::Error>> {
335 println!("\n--- Format Comparison ---");
336
337 let user = UserProfile {
338 id: 98765,
339 username: "bob_builder".to_string(),
340 email: "bob@construction.com".to_string(),
341 age: 35,
342 is_active: false,
343 score: 87.2,
344 };
345
346 // Save in both formats
347 user.save_json("temp_format_comparison.json")?;
348 user.save_binary("temp_format_comparison.bin")?;
349
350 // Compare file sizes
351 let json_size = fs::metadata("temp_format_comparison.json")?.len();
352 let binary_size = fs::metadata("temp_format_comparison.bin")?.len();
353
354 println!("Format comparison for UserProfile:");
355 println!(" JSON file size: {} bytes", json_size);
356 println!(" Binary file size: {} bytes", binary_size);
357 println!(
358 " Size ratio (JSON/Binary): {:.2}x",
359 json_size as f64 / binary_size as f64
360 );
361
362 // Demonstrate readability
363 let json_content = fs::read_to_string("temp_format_comparison.json")?;
364 println!("\nJSON format (human-readable):");
365 println!("{}", json_content);
366
367 println!("\nBinary format (first 32 bytes as hex):");
368 let binary_content = fs::read("temp_format_comparison.bin")?;
369 for (i, byte) in binary_content.iter().take(32).enumerate() {
370 if i % 16 == 0 && i > 0 {
371 println!();
372 }
373 print!("{:02x} ", byte);
374 }
375 println!("\n... ({} total bytes)", binary_content.len());
376
377 // Load and verify both formats produce identical results
378 let json_loaded = UserProfile::load_json("temp_format_comparison.json")?;
379 let binary_loaded = UserProfile::load_binary("temp_format_comparison.bin")?;
380
381 assert_eq!(json_loaded, binary_loaded);
382 println!("\nFormat consistency verification: PASSED");
383
384 Ok(())
385 }
52 fn demonstrate_tensor_serialization() -> Result<(), Box<dyn std::error::Error>> {
53 println!("--- Tensor Serialization ---");
54
55 // Create a tensor with some data
56 let original_tensor = Tensor::from_slice(&[1.0, 2.0, 3.0, 4.0, 5.0, 6.0], vec![2, 3]).unwrap();
57 println!(
58 "Original tensor: shape {:?}, data: {:?}",
59 original_tensor.shape().dims,
60 original_tensor.data()
61 );
62
63 // Save tensor in JSON format
64 let json_path = "temp_tensor.json";
65 original_tensor.save_json(json_path)?;
66 println!("Saved tensor to JSON: {}", json_path);
67
68 // Load tensor from JSON
69 let loaded_tensor_json = Tensor::load_json(json_path)?;
70 println!(
71 "Loaded from JSON: shape {:?}, data: {:?}",
72 loaded_tensor_json.shape().dims,
73 loaded_tensor_json.data()
74 );
75
76 // Verify data integrity
77 assert_eq!(
78 original_tensor.shape().dims,
79 loaded_tensor_json.shape().dims
80 );
81 assert_eq!(original_tensor.data(), loaded_tensor_json.data());
82 println!("JSON serialization verification: PASSED");
83
84 // Save tensor in binary format
85 let binary_path = "temp_tensor.bin";
86 original_tensor.save_binary(binary_path)?;
87 println!("Saved tensor to binary: {}", binary_path);
88
89 // Load tensor from binary
90 let loaded_tensor_binary = Tensor::load_binary(binary_path)?;
91 println!(
92 "Loaded from binary: shape {:?}, data: {:?}",
93 loaded_tensor_binary.shape().dims,
94 loaded_tensor_binary.data()
95 );
96
97 // Verify data integrity
98 assert_eq!(
99 original_tensor.shape().dims,
100 loaded_tensor_binary.shape().dims
101 );
102 assert_eq!(original_tensor.data(), loaded_tensor_binary.data());
103 println!("Binary serialization verification: PASSED");
104
105 Ok(())
106}
107
108/// Demonstrate optimizer serialization and deserialization
109fn demonstrate_optimizer_serialization() -> Result<(), Box<dyn std::error::Error>> {
110 println!("\n--- Optimizer Serialization ---");
111
112 // Create an optimizer with some parameters
113 let mut weight = Tensor::randn(vec![2, 2], Some(42)).with_requires_grad();
114 let mut bias = Tensor::randn(vec![2], Some(43)).with_requires_grad();
115
116 let config = AdamConfig {
117 learning_rate: 0.001,
118 beta1: 0.9,
119 beta2: 0.999,
120 eps: 1e-8,
121 weight_decay: 0.0,
122 amsgrad: false,
123 };
124
125 let mut optimizer = Adam::with_config(config);
126 optimizer.add_parameter(&weight);
127 optimizer.add_parameter(&bias);
128
129 println!(
130 "Created optimizer with {} parameters",
131 optimizer.parameter_count()
132 );
133 println!("Learning rate: {}", optimizer.learning_rate());
134
135 // Simulate some training steps
136 for _ in 0..3 {
137 let mut loss = weight.sum() + bias.sum();
138 loss.backward(None);
139 optimizer.step(&mut [&mut weight, &mut bias]);
140 optimizer.zero_grad(&mut [&mut weight, &mut bias]);
141 }
142
143 // Save optimizer state
144 let optimizer_path = "temp_optimizer.json";
145 optimizer.save_json(optimizer_path)?;
146 println!("Saved optimizer to: {}", optimizer_path);
147
148 // Load optimizer state
149 let loaded_optimizer = Adam::load_json(optimizer_path)?;
150 println!(
151 "Loaded optimizer with {} parameters",
152 loaded_optimizer.parameter_count()
153 );
154 println!("Learning rate: {}", loaded_optimizer.learning_rate());
155
156 // Verify optimizer state
157 assert_eq!(
158 optimizer.parameter_count(),
159 loaded_optimizer.parameter_count()
160 );
161 assert_eq!(optimizer.learning_rate(), loaded_optimizer.learning_rate());
162 println!("Optimizer serialization verification: PASSED");
163
164 Ok(())
165}
166
167/// Demonstrate format comparison and performance characteristics
168fn demonstrate_format_comparison() -> Result<(), Box<dyn std::error::Error>> {
169 println!("\n--- Format Comparison ---");
170
171 // Create a larger tensor for comparison
172 let tensor = Tensor::randn(vec![10, 10], Some(44));
173
174 // Save in both formats
175 tensor.save_json("temp_comparison.json")?;
176 tensor.save_binary("temp_comparison.bin")?;
177
178 // Compare file sizes
179 let json_size = fs::metadata("temp_comparison.json")?.len();
180 let binary_size = fs::metadata("temp_comparison.bin")?.len();
181
182 println!("JSON file size: {} bytes", json_size);
183 println!("Binary file size: {} bytes", binary_size);
184 println!(
185 "Compression ratio: {:.2}x",
186 json_size as f64 / binary_size as f64
187 );
188
189 // Load and verify both formats
190 let json_tensor = Tensor::load_json("temp_comparison.json")?;
191 let binary_tensor = Tensor::load_binary("temp_comparison.bin")?;
192
193 assert_eq!(tensor.shape().dims, json_tensor.shape().dims);
194 assert_eq!(tensor.shape().dims, binary_tensor.shape().dims);
195 assert_eq!(tensor.data(), json_tensor.data());
196 assert_eq!(tensor.data(), binary_tensor.data());
197
198 println!("Format comparison verification: PASSED");
199
200 Ok(())
201}
202
203/// Demonstrate a basic model checkpointing workflow
204fn demonstrate_model_checkpointing() -> Result<(), Box<dyn std::error::Error>> {
205 println!("\n--- Model Checkpointing ---");
206
207 // Create a simple model (weights and bias)
208 let mut weights = Tensor::randn(vec![2, 1], Some(45)).with_requires_grad();
209 let mut bias = Tensor::randn(vec![1], Some(46)).with_requires_grad();
210
211 // Create optimizer
212 let mut optimizer = Adam::with_learning_rate(0.01);
213 optimizer.add_parameter(&weights);
214 optimizer.add_parameter(&bias);
215
216 println!("Initial weights: {:?}", weights.data());
217 println!("Initial bias: {:?}", bias.data());
218
219 // Simulate training
220 for epoch in 0..5 {
221 let mut loss = weights.sum() + bias.sum();
222 loss.backward(None);
223 optimizer.step(&mut [&mut weights, &mut bias]);
224 optimizer.zero_grad(&mut [&mut weights, &mut bias]);
225
226 if epoch % 2 == 0 {
227 // Save checkpoint
228 let checkpoint_dir = format!("checkpoint_epoch_{}", epoch);
229 fs::create_dir_all(&checkpoint_dir)?;
230
231 weights.save_json(format!("{}/weights.json", checkpoint_dir))?;
232 bias.save_json(format!("{}/bias.json", checkpoint_dir))?;
233 optimizer.save_json(format!("{}/optimizer.json", checkpoint_dir))?;
234
235 println!("Saved checkpoint for epoch {}", epoch);
236 }
237 }
238
239 // Load from checkpoint
240 let loaded_weights = Tensor::load_json("checkpoint_epoch_4/weights.json")?;
241 let loaded_bias = Tensor::load_json("checkpoint_epoch_4/bias.json")?;
242 let loaded_optimizer = Adam::load_json("checkpoint_epoch_4/optimizer.json")?;
243
244 println!("Loaded weights: {:?}", loaded_weights.data());
245 println!("Loaded bias: {:?}", loaded_bias.data());
246 println!(
247 "Loaded optimizer learning rate: {}",
248 loaded_optimizer.learning_rate()
249 );
250
251 // Verify checkpoint integrity
252 assert_eq!(weights.shape().dims, loaded_weights.shape().dims);
253 assert_eq!(bias.shape().dims, loaded_bias.shape().dims);
254 assert_eq!(optimizer.learning_rate(), loaded_optimizer.learning_rate());
255
256 println!("Checkpointing verification: PASSED");
257
258 Ok(())
259 }
533 fn demonstrate_schema_evolution() -> Result<(), Box<dyn std::error::Error>> {
534 println!("\n--- Schema Evolution Patterns ---");
535
536 fs::create_dir_all("temp_schema_tests")?;
537
538 // Create data with different schema versions
539 println!("Creating data with different schema versions:");
540
541 // Version 1 data (minimal)
542 let v1_json = r#"{
543 "version": 1,
544 "name": "legacy_data",
545 "value": 123.45
546 }"#;
547 fs::write("temp_schema_tests/v1_data.json", v1_json)?;
548 println!(" ✓ Version 1 data created (minimal fields)");
549
550 // Version 2 data (with optional field)
551 let v2_json = r#"{
552 "version": 2,
553 "name": "v2_data",
554 "value": 678.90,
555 "optional_field": "added_in_v2"
556 }"#;
557 fs::write("temp_schema_tests/v2_data.json", v2_json)?;
558 println!(" ✓ Version 2 data created (with optional field)");
559
560 // Version 3 data (with all fields)
561 let v3_data = VersionedData {
562 version: 3,
563 name: "v3_data".to_string(),
564 value: 999.99,
565 optional_field: Some("present".to_string()),
566 new_field: Some(42),
567 };
568 v3_data.save_json("temp_schema_tests/v3_data.json")?;
569 println!(" ✓ Version 3 data created (all fields)");
570
571 // Test backward compatibility
572 println!("\nTesting backward compatibility:");
573
574 // Load v1 data with current deserializer
575 match VersionedData::load_json("temp_schema_tests/v1_data.json") {
576 Ok(data) => {
577 println!(" ✓ V1 data loaded successfully:");
578 println!(" Name: {}", data.name);
579 println!(" Value: {}", data.value);
580 println!(" Optional field: {:?}", data.optional_field);
581 println!(" New field: {:?}", data.new_field);
582 }
583 Err(e) => println!(" ✗ Failed to load V1 data: {}", e),
584 }
585
586 // Load v2 data with current deserializer
587 match VersionedData::load_json("temp_schema_tests/v2_data.json") {
588 Ok(data) => {
589 println!(" ✓ V2 data loaded successfully:");
590 println!(" Name: {}", data.name);
591 println!(" Value: {}", data.value);
592 println!(" Optional field: {:?}", data.optional_field);
593 println!(" New field: {:?}", data.new_field);
594 }
595 Err(e) => println!(" ✗ Failed to load V2 data: {}", e),
596 }
597
598 // Test future version rejection
599 println!("\nTesting future version handling:");
600 let future_version_json = r#"{
601 "version": 99,
602 "name": "future_data",
603 "value": 123.45,
604 "unknown_field": "should_be_ignored"
605 }"#;
606 fs::write("temp_schema_tests/future_data.json", future_version_json)?;
607
608 match VersionedData::load_json("temp_schema_tests/future_data.json") {
609 Ok(_) => println!(" ✗ Unexpected: Future version was accepted"),
610 Err(e) => println!(" ✓ Expected rejection of future version: {}", e),
611 }
612
613 // Demonstrate migration strategy
614 println!("\nDemonstrating migration strategy:");
615 println!(" Strategy: Load old format, upgrade to new format, save");
616
617 // Simulate migrating v1 data to v3 format
618 let v1_loaded = VersionedData::load_json("temp_schema_tests/v1_data.json")?;
619 let v1_upgraded = VersionedData {
620 version: 3,
621 name: v1_loaded.name,
622 value: v1_loaded.value,
623 optional_field: Some("migrated_default".to_string()),
624 new_field: Some(0),
625 };
626
627 v1_upgraded.save_json("temp_schema_tests/v1_migrated.json")?;
628 println!(" ✓ V1 data migrated to V3 format");
629
630 Ok(())
631}
632
633/// Demonstrate recovery strategies
634fn demonstrate_recovery_strategies() -> Result<(), Box<dyn std::error::Error>> {
635 println!("\n--- Recovery Strategies ---");
636
637 fs::create_dir_all("temp_recovery_tests")?;
638
639 // Strategy 1: Graceful degradation
640 println!("1. Graceful Degradation Strategy:");
641
642 // Create complete data
643 let complete_data = RecoverableData {
644 critical_field: "essential_info".to_string(),
645 important_field: Some("important_info".to_string()),
646 optional_field: Some("nice_to_have".to_string()),
647 metadata: {
648 let mut map = HashMap::new();
649 map.insert("key1".to_string(), "value1".to_string());
650 map.insert("key2".to_string(), "value2".to_string());
651 map
652 },
653 };
654
655 // Save complete data
656 complete_data.save_json("temp_recovery_tests/complete.json")?;
657
658 // Create partial data (missing some fields)
659 let partial_json = r#"{
660 "critical_field": "essential_info",
661 "optional_field": "nice_to_have"
662 }"#;
663 fs::write("temp_recovery_tests/partial.json", partial_json)?;
664
665 // Load partial data and demonstrate recovery
666 match RecoverableData::load_json("temp_recovery_tests/partial.json") {
667 Ok(recovered) => {
668 println!(" ✓ Partial data recovered successfully:");
669 println!(" Critical field: {}", recovered.critical_field);
670 println!(
671 " Important field: {:?} (missing, set to None)",
672 recovered.important_field
673 );
674 println!(" Optional field: {:?}", recovered.optional_field);
675 println!(
676 " Metadata: {} entries (defaulted to empty)",
677 recovered.metadata.len()
678 );
679 }
680 Err(e) => println!(" ✗ Recovery failed: {}", e),
681 }
682
683 // Strategy 2: Error context preservation
684 println!("\n2. Error Context Preservation:");
685
686 let malformed_json = r#"{
687 "critical_field": "essential_info",
688 "important_field": 12345,
689 "metadata": "not_a_map"
690 }"#;
691 fs::write("temp_recovery_tests/malformed.json", malformed_json)?;
692
693 match RecoverableData::load_json("temp_recovery_tests/malformed.json") {
694 Ok(_) => println!(" ✗ Unexpected: Malformed data was accepted"),
695 Err(e) => {
696 println!(" ✓ Error context preserved:");
697 println!(" Error: {}", e);
698 println!(" Error type: {:?}", std::mem::discriminant(&e));
699 }
700 }
701
702 // Strategy 3: Fallback data sources
703 println!("\n3. Fallback Data Sources:");
704
705 // Primary source (corrupted)
706 let corrupted_primary = "corrupted data";
707 fs::write("temp_recovery_tests/primary.json", corrupted_primary)?;
708
709 // Backup source (valid)
710 let backup_data = RecoverableData {
711 critical_field: "backup_critical".to_string(),
712 important_field: Some("backup_important".to_string()),
713 optional_field: None,
714 metadata: HashMap::new(),
715 };
716 backup_data.save_json("temp_recovery_tests/backup.json")?;
717
718 // Default fallback
719 let default_data = RecoverableData {
720 critical_field: "default_critical".to_string(),
721 important_field: None,
722 optional_field: None,
723 metadata: HashMap::new(),
724 };
725
726 println!(" Attempting to load data with fallback chain:");
727
728 // Try primary source
729 let loaded_data = match RecoverableData::load_json("temp_recovery_tests/primary.json") {
730 Ok(data) => {
731 println!(" ✓ Loaded from primary source");
732 data
733 }
734 Err(_) => {
735 println!(" ✗ Primary source failed, trying backup");
736
737 // Try backup source
738 match RecoverableData::load_json("temp_recovery_tests/backup.json") {
739 Ok(data) => {
740 println!(" ✓ Loaded from backup source");
741 data
742 }
743 Err(_) => {
744 println!(" ✗ Backup source failed, using default");
745 default_data
746 }
747 }
748 }
749 };
750
751 println!(" Final loaded data:");
752 println!(" Critical field: {}", loaded_data.critical_field);
753
754 Ok(())
755}
756
757/// Demonstrate production-ready error handling
758fn demonstrate_production_error_handling() -> Result<(), Box<dyn std::error::Error>> {
759 println!("\n--- Production Error Handling ---");
760
761 fs::create_dir_all("temp_production_tests")?;
762
763 // Error logging and monitoring
764 println!("1. Error Logging and Monitoring:");
765
766 let test_data = VersionedData {
767 version: 2,
768 name: "production_test".to_string(),
769 value: 42.0,
770 optional_field: Some("test".to_string()),
771 new_field: None,
772 };
773
774 // Create error log for demonstration
775 let mut error_log = fs::OpenOptions::new()
776 .create(true)
777 .append(true)
778 .open("temp_production_tests/error.log")?;
779
780 // Function to log errors in production format
781 let mut log_error =
782 |error: &SerializationError, context: &str| -> Result<(), Box<dyn std::error::Error>> {
783 let timestamp = std::time::SystemTime::now()
784 .duration_since(std::time::UNIX_EPOCH)?
785 .as_secs();
786
787 writeln!(error_log, "[{}] ERROR in {}: {}", timestamp, context, error)?;
788 Ok(())
789 };
790
791 // Simulate various error scenarios with logging
792 let error_scenarios = vec![
793 ("corrupted_file.json", "invalid json content"),
794 ("missing_fields.json", r#"{"version": 1}"#),
795 (
796 "type_error.json",
797 r#"{"version": "not_number", "name": "test", "value": 42.0}"#,
798 ),
799 ];
800
801 for (filename, content) in error_scenarios {
802 let filepath = format!("temp_production_tests/{}", filename);
803 fs::write(&filepath, content)?;
804
805 match VersionedData::load_json(&filepath) {
806 Ok(_) => println!(" ✗ Unexpected success for {}", filename),
807 Err(e) => {
808 log_error(&e, &format!("load_config({})", filename))?;
809 println!(" ✓ Error logged for {}: {}", filename, e);
810 }
811 }
812 }
813
814 // Health check pattern
815 println!("\n2. Health Check Pattern:");
816
817 let health_check = || -> Result<bool, SerializationError> {
818 // Check if we can serialize/deserialize basic data
819 let test_data = VersionedData {
820 version: 1,
821 name: "health_check".to_string(),
822 value: 1.0,
823 optional_field: None,
824 new_field: None,
825 };
826
827 let serialized = test_data.to_json()?;
828 let _deserialized = VersionedData::from_json(&serialized)?;
829 Ok(true)
830 };
831
832 match health_check() {
833 Ok(_) => println!(" ✓ Serialization system health check passed"),
834 Err(e) => {
835 log_error(&e, "health_check")?;
836 println!(" ✗ Serialization system health check failed: {}", e);
837 }
838 }
839
840 // Circuit breaker pattern simulation
841 println!("\n3. Circuit Breaker Pattern:");
842
843 struct CircuitBreaker {
844 failure_count: u32,
845 failure_threshold: u32,
846 is_open: bool,
847 }
848
849 impl CircuitBreaker {
850 fn new(threshold: u32) -> Self {
851 Self {
852 failure_count: 0,
853 failure_threshold: threshold,
854 is_open: false,
855 }
856 }
857
858 fn call<F, T>(&mut self, operation: F) -> Result<T, String>
859 where
860 F: FnOnce() -> Result<T, SerializationError>,
861 {
862 if self.is_open {
863 return Err("Circuit breaker is open".to_string());
864 }
865
866 match operation() {
867 Ok(result) => {
868 self.failure_count = 0; // Reset on success
869 Ok(result)
870 }
871 Err(e) => {
872 self.failure_count += 1;
873 if self.failure_count >= self.failure_threshold {
874 self.is_open = true;
875 println!(
876 " Circuit breaker opened after {} failures",
877 self.failure_count
878 );
879 }
880 Err(e.to_string())
881 }
882 }
883 }
884 }
885
886 let mut circuit_breaker = CircuitBreaker::new(3);
887
888 // Simulate operations that fail
889 for i in 1..=5 {
890 let result = circuit_breaker
891 .call(|| VersionedData::load_json("temp_production_tests/corrupted_file.json"));
892
893 match result {
894 Ok(_) => println!(" Operation {} succeeded", i),
895 Err(e) => println!(" Operation {} failed: {}", i, e),
896 }
897 }
898
899 // Retry mechanism
900 println!("\n4. Retry Mechanism:");
901
902 let retry_operation = |max_attempts: u32| -> Result<VersionedData, String> {
903 for attempt in 1..=max_attempts {
904 println!(" Attempt {}/{}", attempt, max_attempts);
905
906 // Try different sources in order
907 let sources = vec![
908 "temp_production_tests/corrupted_file.json",
909 "temp_production_tests/missing_fields.json",
910 "temp_production_tests/backup_valid.json",
911 ];
912
913 if attempt == max_attempts {
914 // On final attempt, create valid backup
915 test_data
916 .save_json("temp_production_tests/backup_valid.json")
917 .map_err(|e| format!("Failed to create backup: {}", e))?;
918 }
919
920 for source in &sources {
921 match VersionedData::load_json(source) {
922 Ok(data) => {
923 println!(" ✓ Succeeded loading from {}", source);
924 return Ok(data);
925 }
926 Err(_) => {
927 println!(" ✗ Failed to load from {}", source);
928 continue;
929 }
930 }
931 }
932
933 if attempt < max_attempts {
934 println!(" Waiting before retry...");
935 // In real code, would sleep here
936 }
937 }
938
939 Err("All retry attempts exhausted".to_string())
940 };
941
942 match retry_operation(3) {
943 Ok(data) => println!(" ✓ Retry succeeded: {}", data.name),
944 Err(e) => println!(" ✗ Retry failed: {}", e),
945 }
946
947 Ok(())
948 }
630 fn demonstrate_use_case_recommendations() -> Result<(), Box<dyn std::error::Error>> {
631 println!("\n--- Use Case Recommendations ---");
632
633 println!("JSON Format - Recommended for:");
634 println!(" ✓ Configuration files (human-editable)");
635 println!(" ✓ API responses (web compatibility)");
636 println!(" ✓ Debugging and development (readability)");
637 println!(" ✓ Small data structures (minimal overhead)");
638 println!(" ✓ Cross-language interoperability");
639 println!(" ✓ Schema evolution (self-describing)");
640 println!(" ✓ Text-heavy data with few numbers");
641
642 println!("\nBinary Format - Recommended for:");
643 println!(" ✓ Large datasets (memory/storage efficiency)");
644 println!(" ✓ High-performance applications (speed critical)");
645 println!(" ✓ Numeric-heavy data (ML models, matrices)");
646 println!(" ✓ Network transmission (bandwidth limited)");
647 println!(" ✓ Embedded systems (resource constrained)");
648 println!(" ✓ Long-term storage (space efficiency)");
649 println!(" ✓ Frequent serialization/deserialization");
650
651 // Demonstrate decision matrix
652 println!("\nDecision Matrix Example:");
653
654 let scenarios = vec![
655 (
656 "Web API Configuration",
657 "JSON",
658 "Human readable, web standard, small size",
659 ),
660 (
661 "ML Model Weights",
662 "Binary",
663 "Large numeric data, performance critical",
664 ),
665 (
666 "User Preferences",
667 "JSON",
668 "Human editable, self-documenting",
669 ),
670 (
671 "Real-time Telemetry",
672 "Binary",
673 "High frequency, bandwidth limited",
674 ),
675 (
676 "Application Settings",
677 "JSON",
678 "Developer accessible, version control friendly",
679 ),
680 (
681 "Scientific Dataset",
682 "Binary",
683 "Large arrays, storage efficiency critical",
684 ),
685 ];
686
687 for (scenario, recommendation, reason) in scenarios {
688 println!(" {} -> {} ({})", scenario, recommendation, reason);
689 }
690
691 // Create examples for common scenarios
692 println!("\nPractical Examples:");
693
694 // Configuration file example (JSON)
695 let config = Configuration {
696 version: "2.1.0".to_string(),
697 debug_enabled: false,
698 log_level: "info".to_string(),
699 database_settings: {
700 let mut map = HashMap::new();
701 map.insert("url".to_string(), "postgresql://localhost/app".to_string());
702 map.insert("pool_size".to_string(), "10".to_string());
703 map
704 },
705 feature_flags_enabled: true,
706 max_connections: 100.0,
707 timeout_seconds: 30.0,
708 };
709
710 config.save_json("temp_config_example.json")?;
711 let config_content = fs::read_to_string("temp_config_example.json")?;
712
713 println!("\nConfiguration File (JSON) - Human readable:");
714 for line in config_content.lines().take(5) {
715 println!(" {}", line);
716 }
717 println!(" ... (easily editable by developers)");
718
719 // Data export example (Binary)
720 let export_data = LargeDataset {
721 name: "Training Export".to_string(),
722 values: (0..1000).map(|i| (i as f32).sin()).collect(),
723 labels: (0..1000).map(|i| format!("sample_{:04}", i)).collect(),
724 feature_count: 50,
725 feature_dimension: 20,
726 timestamp_count: 1000,
727 metadata: HashMap::new(),
728 };
729
730 export_data.save_binary("temp_export_example.bin")?;
731 let export_size = fs::metadata("temp_export_example.bin")?.len();
732
733 println!("\nData Export (Binary) - Efficient storage:");
734 println!(
735 " File size: {} bytes ({:.1} KB)",
736 export_size,
737 export_size as f64 / 1024.0
738 );
739 println!(" 1000 numeric values + 50x20 matrix + metadata");
740 println!(" Compact encoding saves significant space vs JSON");
741
742 Ok(())
743}
744
745/// Demonstrate debugging capabilities
746fn demonstrate_debugging_capabilities() -> Result<(), Box<dyn std::error::Error>> {
747 println!("\n--- Debugging Capabilities ---");
748
749 let mut metadata = HashMap::new();
750 metadata.insert("debug_session".to_string(), "session_123".to_string());
751 metadata.insert("error_code".to_string(), "E001".to_string());
752
753 let debug_metrics = PerformanceMetrics {
754 operation: "debug_test".to_string(),
755 duration_micros: 5432,
756 memory_usage_bytes: 16384,
757 cpu_usage_percent: 42.7,
758 throughput_ops_per_sec: 750.0,
759 metadata,
760 };
761
762 println!("Debugging Comparison:");
763
764 // JSON debugging advantages
765 let json_data = debug_metrics.to_json()?;
766 println!("\nJSON Format - Debugging Advantages:");
767 println!(" ✓ Human readable without tools");
768 println!(" ✓ Can inspect values directly");
769 println!(" ✓ Text editors show structure");
770 println!(" ✓ Diff tools work naturally");
771 println!(" ✓ Version control friendly");
772
773 println!("\n Sample JSON output for debugging:");
774 for (i, line) in json_data.lines().enumerate() {
775 if i < 5 {
776 println!(" {}", line);
777 }
778 }
779
780 // Binary debugging limitations
781 let binary_data = debug_metrics.to_binary()?;
782 println!("\nBinary Format - Debugging Limitations:");
783 println!(" ✗ Requires special tools to inspect");
784 println!(" ✗ Not human readable");
785 println!(" ✗ Difficult to debug data corruption");
786 println!(" ✗ Version control shows as binary diff");
787
788 println!("\n Binary data (hex dump for debugging):");
789 print!(" ");
790 for (i, byte) in binary_data.iter().take(40).enumerate() {
791 if i > 0 && i % 16 == 0 {
792 println!();
793 print!(" ");
794 }
795 print!("{:02x} ", byte);
796 }
797 println!("\n (requires hex editor or custom tools)");
798
799 // Development workflow comparison
800 println!("\nDevelopment Workflow Impact:");
801
802 println!("\nJSON Workflow:");
803 println!(" 1. Save data to JSON file");
804 println!(" 2. Open in any text editor");
805 println!(" 3. Inspect values directly");
806 println!(" 4. Make manual edits if needed");
807 println!(" 5. Version control tracks changes");
808
809 println!("\nBinary Workflow:");
810 println!(" 1. Save data to binary file");
811 println!(" 2. Write debugging code to load and print");
812 println!(" 3. Use hex editor for low-level inspection");
813 println!(" 4. Cannot make manual edits easily");
814 println!(" 5. Version control shows binary changes only");
815
816 // Hybrid approach recommendation
817 println!("\nHybrid Approach for Development:");
818 println!(" - Use JSON during development/debugging");
819 println!(" - Switch to binary for production deployment");
820 println!(" - Provide debugging tools that export binary to JSON");
821 println!(" - Include format conversion utilities");
822
823 // Demonstrate debugging scenario
824 println!("\nDebugging Scenario Example:");
825 println!(" Problem: Performance metrics show unexpected values");
826
827 // Save both formats for comparison
828 debug_metrics.save_json("temp_debug_metrics.json")?;
829 debug_metrics.save_binary("temp_debug_metrics.bin")?;
830
831 println!(" JSON approach: Open temp_debug_metrics.json in editor");
832 println!(" -> Immediately see cpu_usage_percent: 42.7");
833 println!(" -> Compare with expected range");
834 println!(" -> Check metadata for debug_session: 'session_123'");
835
836 println!(" Binary approach: Write debugging code");
837 println!(" -> Load binary file programmatically");
838 println!(" -> Print values to console");
839 println!(" -> Additional development time required");
840
841 Ok(())
842}
fn save_binary<P: AsRef<Path>>(&self, path: P) -> SerializationResult<()>
Saves the struct to a binary file
Serializes the struct to binary format and writes it to the specified file path. The file is created if it doesn’t exist, or truncated if it already exists.
§Arguments
path - File path where the binary data should be written
§Returns
Ok(()) on successful file write
Err(SerializationError) if serialization or file I/O fails
§Examples
Saves struct data to a binary file with proper file I/O handling.
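A minimal round-trip sketch using the documented trait methods. `PerformanceMetrics` is one of the example structs from this page and is assumed to implement `StructSerializable`; the field values and file path are illustrative only.

```rust
// Assumes PerformanceMetrics implements StructSerializable (see examples below).
let metrics = PerformanceMetrics {
    operation: "save_binary_demo".to_string(),
    duration_micros: 1200,
    memory_usage_bytes: 4096,
    cpu_usage_percent: 12.5,
    throughput_ops_per_sec: 500.0,
    metadata: std::collections::HashMap::new(),
};

// Creates the file if missing, truncates it if present.
metrics.save_binary("temp_metrics.bin")?;

// Round-trip to verify the written data.
let restored = PerformanceMetrics::load_binary("temp_metrics.bin")?;
assert_eq!(metrics, restored);
```

The `?` operator propagates any `SerializationError` from serialization or file I/O to the caller.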
Examples found in repository
52fn demonstrate_tensor_serialization() -> Result<(), Box<dyn std::error::Error>> {
53 println!("--- Tensor Serialization ---");
54
55 // Create a tensor with some data
56 let original_tensor = Tensor::from_slice(&[1.0, 2.0, 3.0, 4.0, 5.0, 6.0], vec![2, 3]).unwrap();
57 println!(
58 "Original tensor: shape {:?}, data: {:?}",
59 original_tensor.shape().dims,
60 original_tensor.data()
61 );
62
63 // Save tensor in JSON format
64 let json_path = "temp_tensor.json";
65 original_tensor.save_json(json_path)?;
66 println!("Saved tensor to JSON: {}", json_path);
67
68 // Load tensor from JSON
69 let loaded_tensor_json = Tensor::load_json(json_path)?;
70 println!(
71 "Loaded from JSON: shape {:?}, data: {:?}",
72 loaded_tensor_json.shape().dims,
73 loaded_tensor_json.data()
74 );
75
76 // Verify data integrity
77 assert_eq!(
78 original_tensor.shape().dims,
79 loaded_tensor_json.shape().dims
80 );
81 assert_eq!(original_tensor.data(), loaded_tensor_json.data());
82 println!("JSON serialization verification: PASSED");
83
84 // Save tensor in binary format
85 let binary_path = "temp_tensor.bin";
86 original_tensor.save_binary(binary_path)?;
87 println!("Saved tensor to binary: {}", binary_path);
88
89 // Load tensor from binary
90 let loaded_tensor_binary = Tensor::load_binary(binary_path)?;
91 println!(
92 "Loaded from binary: shape {:?}, data: {:?}",
93 loaded_tensor_binary.shape().dims,
94 loaded_tensor_binary.data()
95 );
96
97 // Verify data integrity
98 assert_eq!(
99 original_tensor.shape().dims,
100 loaded_tensor_binary.shape().dims
101 );
102 assert_eq!(original_tensor.data(), loaded_tensor_binary.data());
103 println!("Binary serialization verification: PASSED");
104
105 Ok(())
106}
107
108/// Demonstrate optimizer serialization and deserialization
109fn demonstrate_optimizer_serialization() -> Result<(), Box<dyn std::error::Error>> {
110 println!("\n--- Optimizer Serialization ---");
111
112 // Create an optimizer with some parameters
113 let mut weight = Tensor::randn(vec![2, 2], Some(42)).with_requires_grad();
114 let mut bias = Tensor::randn(vec![2], Some(43)).with_requires_grad();
115
116 let config = AdamConfig {
117 learning_rate: 0.001,
118 beta1: 0.9,
119 beta2: 0.999,
120 eps: 1e-8,
121 weight_decay: 0.0,
122 amsgrad: false,
123 };
124
125 let mut optimizer = Adam::with_config(config);
126 optimizer.add_parameter(&weight);
127 optimizer.add_parameter(&bias);
128
129 println!(
130 "Created optimizer with {} parameters",
131 optimizer.parameter_count()
132 );
133 println!("Learning rate: {}", optimizer.learning_rate());
134
135 // Simulate some training steps
136 for _ in 0..3 {
137 let mut loss = weight.sum() + bias.sum();
138 loss.backward(None);
139 optimizer.step(&mut [&mut weight, &mut bias]);
140 optimizer.zero_grad(&mut [&mut weight, &mut bias]);
141 }
142
143 // Save optimizer state
144 let optimizer_path = "temp_optimizer.json";
145 optimizer.save_json(optimizer_path)?;
146 println!("Saved optimizer to: {}", optimizer_path);
147
148 // Load optimizer state
149 let loaded_optimizer = Adam::load_json(optimizer_path)?;
150 println!(
151 "Loaded optimizer with {} parameters",
152 loaded_optimizer.parameter_count()
153 );
154 println!("Learning rate: {}", loaded_optimizer.learning_rate());
155
156 // Verify optimizer state
157 assert_eq!(
158 optimizer.parameter_count(),
159 loaded_optimizer.parameter_count()
160 );
161 assert_eq!(optimizer.learning_rate(), loaded_optimizer.learning_rate());
162 println!("Optimizer serialization verification: PASSED");
163
164 Ok(())
165}
166
167/// Demonstrate format comparison and performance characteristics
168fn demonstrate_format_comparison() -> Result<(), Box<dyn std::error::Error>> {
169 println!("\n--- Format Comparison ---");
170
171 // Create a larger tensor for comparison
172 let tensor = Tensor::randn(vec![10, 10], Some(44));
173
174 // Save in both formats
175 tensor.save_json("temp_comparison.json")?;
176 tensor.save_binary("temp_comparison.bin")?;
177
178 // Compare file sizes
179 let json_size = fs::metadata("temp_comparison.json")?.len();
180 let binary_size = fs::metadata("temp_comparison.bin")?.len();
181
182 println!("JSON file size: {} bytes", json_size);
183 println!("Binary file size: {} bytes", binary_size);
184 println!(
185 "Compression ratio: {:.2}x",
186 json_size as f64 / binary_size as f64
187 );
188
189 // Load and verify both formats
190 let json_tensor = Tensor::load_json("temp_comparison.json")?;
191 let binary_tensor = Tensor::load_binary("temp_comparison.bin")?;
192
193 assert_eq!(tensor.shape().dims, json_tensor.shape().dims);
194 assert_eq!(tensor.shape().dims, binary_tensor.shape().dims);
195 assert_eq!(tensor.data(), json_tensor.data());
196 assert_eq!(tensor.data(), binary_tensor.data());
197
198 println!("Format comparison verification: PASSED");
199
200 Ok(())
201}
202
203/// Demonstrate a basic model checkpointing workflow
204fn demonstrate_model_checkpointing() -> Result<(), Box<dyn std::error::Error>> {
205 println!("\n--- Model Checkpointing ---");
206
207 // Create a simple model (weights and bias)
208 let mut weights = Tensor::randn(vec![2, 1], Some(45)).with_requires_grad();
209 let mut bias = Tensor::randn(vec![1], Some(46)).with_requires_grad();
210
211 // Create optimizer
212 let mut optimizer = Adam::with_learning_rate(0.01);
213 optimizer.add_parameter(&weights);
214 optimizer.add_parameter(&bias);
215
216 println!("Initial weights: {:?}", weights.data());
217 println!("Initial bias: {:?}", bias.data());
218
219 // Simulate training
220 for epoch in 0..5 {
221 let mut loss = weights.sum() + bias.sum();
222 loss.backward(None);
223 optimizer.step(&mut [&mut weights, &mut bias]);
224 optimizer.zero_grad(&mut [&mut weights, &mut bias]);
225
226 if epoch % 2 == 0 {
227 // Save checkpoint
228 let checkpoint_dir = format!("checkpoint_epoch_{}", epoch);
229 fs::create_dir_all(&checkpoint_dir)?;
230
231 weights.save_json(format!("{}/weights.json", checkpoint_dir))?;
232 bias.save_json(format!("{}/bias.json", checkpoint_dir))?;
233 optimizer.save_json(format!("{}/optimizer.json", checkpoint_dir))?;
234
235 println!("Saved checkpoint for epoch {}", epoch);
236 }
237 }
238
239 // Load from checkpoint
240 let loaded_weights = Tensor::load_json("checkpoint_epoch_4/weights.json")?;
241 let loaded_bias = Tensor::load_json("checkpoint_epoch_4/bias.json")?;
242 let loaded_optimizer = Adam::load_json("checkpoint_epoch_4/optimizer.json")?;
243
244 println!("Loaded weights: {:?}", loaded_weights.data());
245 println!("Loaded bias: {:?}", loaded_bias.data());
246 println!(
247 "Loaded optimizer learning rate: {}",
248 loaded_optimizer.learning_rate()
249 );
250
251 // Verify checkpoint integrity
252 assert_eq!(weights.shape().dims, loaded_weights.shape().dims);
253 assert_eq!(bias.shape().dims, loaded_bias.shape().dims);
254 assert_eq!(optimizer.learning_rate(), loaded_optimizer.learning_rate());
255
256 println!("Checkpointing verification: PASSED");
257
258 Ok(())
259}
260
261/// Demonstrate error handling for serialization operations
262fn demonstrate_error_handling() -> Result<(), Box<dyn std::error::Error>> {
263 println!("\n--- Error Handling ---");
264
265 // Test loading non-existent file
266 match Tensor::load_json("nonexistent_file.json") {
267 Ok(_) => println!("Unexpected: Successfully loaded non-existent file"),
268 Err(e) => println!("Expected error loading non-existent file: {}", e),
269 }
270
271 // Test loading with wrong format
272 let tensor = Tensor::randn(vec![2, 2], Some(47));
273 tensor.save_binary("temp_binary.bin")?;
274
275 match Tensor::load_json("temp_binary.bin") {
276 Ok(_) => println!("Unexpected: Successfully loaded binary as JSON"),
277 Err(e) => println!("Expected error loading binary as JSON: {}", e),
278 }
279
280 // Test loading corrupted file
281 fs::write("temp_invalid.json", "invalid json content")?;
282 match Tensor::load_json("temp_invalid.json") {
283 Ok(_) => println!("Unexpected: Successfully loaded invalid JSON"),
284 Err(e) => println!("Expected error loading invalid JSON: {}", e),
285 }
286
287 println!("Error handling verification: PASSED");
288
289 Ok(())
290}
More examples
272fn demonstrate_app_settings_serialization() -> Result<(), Box<dyn std::error::Error>> {
273 println!("\n--- App Settings Serialization ---");
274
275 // Create app settings with collections and optional fields
276 let mut env_vars = HashMap::new();
277 env_vars.insert("LOG_LEVEL".to_string(), "info".to_string());
278 env_vars.insert("PORT".to_string(), "8080".to_string());
279 env_vars.insert("HOST".to_string(), "localhost".to_string());
280
281 let settings = AppSettings {
282 app_name: "Train Station Example".to_string(),
283 version: "1.0.0".to_string(),
284 debug_mode: true,
285 max_connections: 100,
286 timeout_seconds: 30.5,
287 features: vec![
288 "authentication".to_string(),
289 "logging".to_string(),
290 "metrics".to_string(),
291 ],
292 environment_vars: env_vars,
293 optional_database_url: Some("postgresql://localhost:5432/mydb".to_string()),
294 };
295
296 println!("Original app settings:");
297 println!(" App Name: {}", settings.app_name);
298 println!(" Version: {}", settings.version);
299 println!(" Debug Mode: {}", settings.debug_mode);
300 println!(" Max Connections: {}", settings.max_connections);
301 println!(" Timeout: {} seconds", settings.timeout_seconds);
302 println!(" Features: {:?}", settings.features);
303 println!(" Environment Variables: {:?}", settings.environment_vars);
304 println!(" Database URL: {:?}", settings.optional_database_url);
305
306 // Serialize to binary format for efficient storage
307 let binary_data = settings.to_binary()?;
308 println!("\nSerialized to binary: {} bytes", binary_data.len());
309
310 // Save to binary file
311 settings.save_binary("temp_app_settings.bin")?;
312 println!("Saved to file: temp_app_settings.bin");
313
314 // Load from binary file
315 let loaded_settings = AppSettings::load_binary("temp_app_settings.bin")?;
316 println!("\nLoaded app settings:");
317 println!(" App Name: {}", loaded_settings.app_name);
318 println!(" Version: {}", loaded_settings.version);
319 println!(" Debug Mode: {}", loaded_settings.debug_mode);
320 println!(" Features count: {}", loaded_settings.features.len());
321 println!(
322 " Environment variables count: {}",
323 loaded_settings.environment_vars.len()
324 );
325
326 // Verify data integrity
327 assert_eq!(settings, loaded_settings);
328 println!("Data integrity verification: PASSED");
329
330 Ok(())
331}
332
333/// Demonstrate format comparison between JSON and binary
334fn demonstrate_format_comparison() -> Result<(), Box<dyn std::error::Error>> {
335 println!("\n--- Format Comparison ---");
336
337 let user = UserProfile {
338 id: 98765,
339 username: "bob_builder".to_string(),
340 email: "bob@construction.com".to_string(),
341 age: 35,
342 is_active: false,
343 score: 87.2,
344 };
345
346 // Save in both formats
347 user.save_json("temp_format_comparison.json")?;
348 user.save_binary("temp_format_comparison.bin")?;
349
350 // Compare file sizes
351 let json_size = fs::metadata("temp_format_comparison.json")?.len();
352 let binary_size = fs::metadata("temp_format_comparison.bin")?.len();
353
354 println!("Format comparison for UserProfile:");
355 println!(" JSON file size: {} bytes", json_size);
356 println!(" Binary file size: {} bytes", binary_size);
357 println!(
358 " Size ratio (JSON/Binary): {:.2}x",
359 json_size as f64 / binary_size as f64
360 );
361
362 // Demonstrate readability
363 let json_content = fs::read_to_string("temp_format_comparison.json")?;
364 println!("\nJSON format (human-readable):");
365 println!("{}", json_content);
366
367 println!("\nBinary format (first 32 bytes as hex):");
368 let binary_content = fs::read("temp_format_comparison.bin")?;
369 for (i, byte) in binary_content.iter().take(32).enumerate() {
370 if i % 16 == 0 && i > 0 {
371 println!();
372 }
373 print!("{:02x} ", byte);
374 }
375 println!("\n... ({} total bytes)", binary_content.len());
376
377 // Load and verify both formats produce identical results
378 let json_loaded = UserProfile::load_json("temp_format_comparison.json")?;
379 let binary_loaded = UserProfile::load_binary("temp_format_comparison.bin")?;
380
381 assert_eq!(json_loaded, binary_loaded);
382 println!("\nFormat consistency verification: PASSED");
383
384 Ok(())
385}
345fn demonstrate_common_error_scenarios() -> Result<(), Box<dyn std::error::Error>> {
346 println!("--- Common Error Scenarios ---");
347
348 fs::create_dir_all("temp_error_tests")?;
349
350 // Scenario 1: Corrupted JSON file
351 println!("1. Corrupted JSON File:");
352 let corrupted_json = r#"{"name": "test", "value": 42, "incomplete"#;
353 fs::write("temp_error_tests/corrupted.json", corrupted_json)?;
354
355 match VersionedData::load_json("temp_error_tests/corrupted.json") {
356 Ok(_) => println!(" Unexpected: Corrupted JSON was parsed successfully"),
357 Err(e) => println!(" Expected error: {}", e),
358 }
359
360 // Scenario 2: Missing required fields
361 println!("\n2. Missing Required Fields:");
362 let incomplete_json = r#"{"name": "test"}"#;
363 fs::write("temp_error_tests/incomplete.json", incomplete_json)?;
364
365 match VersionedData::load_json("temp_error_tests/incomplete.json") {
366 Ok(_) => println!(" Unexpected: Incomplete JSON was parsed successfully"),
367 Err(e) => println!(" Expected error: {}", e),
368 }
369
370 // Scenario 3: Type mismatches
371 println!("\n3. Type Mismatch:");
372 let type_mismatch_json = r#"{"version": "not_a_number", "name": "test", "value": 42.0}"#;
373 fs::write("temp_error_tests/type_mismatch.json", type_mismatch_json)?;
374
375 match VersionedData::load_json("temp_error_tests/type_mismatch.json") {
376 Ok(_) => println!(" Unexpected: Type mismatch was handled gracefully"),
377 Err(e) => println!(" Expected error: {}", e),
378 }
379
380 // Scenario 4: File not found
381 println!("\n4. File Not Found:");
382 match VersionedData::load_json("temp_error_tests/nonexistent.json") {
383 Ok(_) => println!(" Unexpected: Non-existent file was loaded"),
384 Err(e) => println!(" Expected error: {}", e),
385 }
386
387 // Scenario 5: Binary format mismatch
388 println!("\n5. Binary Format Mismatch:");
389 let invalid_binary = vec![0xFF, 0xFF, 0xFF, 0xFF]; // Invalid binary data
390 fs::write("temp_error_tests/invalid.bin", invalid_binary)?;
391
392 match VersionedData::load_binary("temp_error_tests/invalid.bin") {
393 Ok(_) => println!(" Unexpected: Invalid binary was parsed successfully"),
394 Err(e) => println!(" Expected error: {}", e),
395 }
396
397 // Scenario 6: Wrong format loading
398 println!("\n6. Wrong Format Loading:");
399 let valid_data = VersionedData {
400 version: 1,
401 name: "test".to_string(),
402 value: 42.0,
403 optional_field: None,
404 new_field: None,
405 };
406 valid_data.save_binary("temp_error_tests/valid.bin")?;
407
408 // Try to load binary file as JSON
409 match VersionedData::load_json("temp_error_tests/valid.bin") {
410 Ok(_) => println!(" Unexpected: Binary file was loaded as JSON"),
411 Err(e) => println!(" Expected error: {}", e),
412 }
413
414 Ok(())
415}
fn load_json<P: AsRef<Path>>(path: P) -> SerializationResult<Self>
Loads the struct from a JSON file
Reads JSON data from the specified file path and deserializes it into a new instance of the struct.
§Arguments
path - File path containing the JSON data to read
§Returns
Ok(Self) on successful deserialization
Err(SerializationError) if file I/O or deserialization fails
§Examples
Loads struct data from a JSON file with proper error handling.
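A short sketch of loading with explicit error handling. `VersionedData` is one of the example structs from this page and is assumed to implement `StructSerializable`; the file path is illustrative only.

```rust
// Assumes VersionedData implements StructSerializable (see examples below).
match VersionedData::load_json("config.json") {
    Ok(data) => println!("Loaded: {}", data.name),
    // A single error type covers both file I/O failures (missing file,
    // permissions) and deserialization failures (malformed JSON, missing
    // fields, type mismatches).
    Err(e) => eprintln!("Failed to load: {}", e),
}
```

When failure should abort the caller, the `?` operator is the more idiomatic choice: `let data = VersionedData::load_json("config.json")?;`.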
Examples found in repository
106 pub fn load_json(
107 path: &str,
108 input_size: usize,
109 output_size: usize,
110 ) -> Result<Self, Box<dyn std::error::Error>> {
111 let weight_path = format!("{}_weight.json", path);
112 let bias_path = format!("{}_bias.json", path);
113
114 let weight = Tensor::load_json(&weight_path)?.with_requires_grad();
115 let bias = Tensor::load_json(&bias_path)?.with_requires_grad();
116
117 Ok(Self {
118 weight,
119 bias,
120 input_size,
121 output_size,
122 })
123}
More examples
224fn demonstrate_user_profile_serialization() -> Result<(), Box<dyn std::error::Error>> {
225 println!("--- User Profile Serialization ---");
226
227 // Create a user profile with various field types
228 let user = UserProfile {
229 id: 12345,
230 username: "alice_cooper".to_string(),
231 email: "alice@example.com".to_string(),
232 age: 28,
233 is_active: true,
234 score: 95.7,
235 };
236
237 println!("Original user profile:");
238 println!(" ID: {}", user.id);
239 println!(" Username: {}", user.username);
240 println!(" Email: {}", user.email);
241 println!(" Age: {}", user.age);
242 println!(" Active: {}", user.is_active);
243 println!(" Score: {}", user.score);
244
245 // Serialize to JSON
246 let json_data = user.to_json()?;
247 println!("\nSerialized to JSON:");
248 println!("{}", json_data);
249
250 // Save to JSON file
251 user.save_json("temp_user_profile.json")?;
252 println!("Saved to file: temp_user_profile.json");
253
254 // Load from JSON file
255 let loaded_user = UserProfile::load_json("temp_user_profile.json")?;
256 println!("\nLoaded user profile:");
257 println!(" ID: {}", loaded_user.id);
258 println!(" Username: {}", loaded_user.username);
259 println!(" Email: {}", loaded_user.email);
260 println!(" Age: {}", loaded_user.age);
261 println!(" Active: {}", loaded_user.is_active);
262 println!(" Score: {}", loaded_user.score);
263
264 // Verify data integrity
265 assert_eq!(user, loaded_user);
266 println!("Data integrity verification: PASSED");
267
268 Ok(())
269}
270
271/// Demonstrate serialization with collections and optional fields
272fn demonstrate_app_settings_serialization() -> Result<(), Box<dyn std::error::Error>> {
273 println!("\n--- App Settings Serialization ---");
274
275 // Create app settings with collections and optional fields
276 let mut env_vars = HashMap::new();
277 env_vars.insert("LOG_LEVEL".to_string(), "info".to_string());
278 env_vars.insert("PORT".to_string(), "8080".to_string());
279 env_vars.insert("HOST".to_string(), "localhost".to_string());
280
281 let settings = AppSettings {
282 app_name: "Train Station Example".to_string(),
283 version: "1.0.0".to_string(),
284 debug_mode: true,
285 max_connections: 100,
286 timeout_seconds: 30.5,
287 features: vec![
288 "authentication".to_string(),
289 "logging".to_string(),
290 "metrics".to_string(),
291 ],
292 environment_vars: env_vars,
293 optional_database_url: Some("postgresql://localhost:5432/mydb".to_string()),
294 };
295
296 println!("Original app settings:");
297 println!(" App Name: {}", settings.app_name);
298 println!(" Version: {}", settings.version);
299 println!(" Debug Mode: {}", settings.debug_mode);
300 println!(" Max Connections: {}", settings.max_connections);
301 println!(" Timeout: {} seconds", settings.timeout_seconds);
302 println!(" Features: {:?}", settings.features);
303 println!(" Environment Variables: {:?}", settings.environment_vars);
304 println!(" Database URL: {:?}", settings.optional_database_url);
305
306 // Serialize to binary format for efficient storage
307 let binary_data = settings.to_binary()?;
308 println!("\nSerialized to binary: {} bytes", binary_data.len());
309
310 // Save to binary file
311 settings.save_binary("temp_app_settings.bin")?;
312 println!("Saved to file: temp_app_settings.bin");
313
314 // Load from binary file
315 let loaded_settings = AppSettings::load_binary("temp_app_settings.bin")?;
316 println!("\nLoaded app settings:");
317 println!(" App Name: {}", loaded_settings.app_name);
318 println!(" Version: {}", loaded_settings.version);
319 println!(" Debug Mode: {}", loaded_settings.debug_mode);
320 println!(" Features count: {}", loaded_settings.features.len());
321 println!(
322 " Environment variables count: {}",
323 loaded_settings.environment_vars.len()
324 );
325
326 // Verify data integrity
327 assert_eq!(settings, loaded_settings);
328 println!("Data integrity verification: PASSED");
329
330 Ok(())
331}
332
333/// Demonstrate format comparison between JSON and binary
334fn demonstrate_format_comparison() -> Result<(), Box<dyn std::error::Error>> {
335 println!("\n--- Format Comparison ---");
336
337 let user = UserProfile {
338 id: 98765,
339 username: "bob_builder".to_string(),
340 email: "bob@construction.com".to_string(),
341 age: 35,
342 is_active: false,
343 score: 87.2,
344 };
345
346 // Save in both formats
347 user.save_json("temp_format_comparison.json")?;
348 user.save_binary("temp_format_comparison.bin")?;
349
350 // Compare file sizes
351 let json_size = fs::metadata("temp_format_comparison.json")?.len();
352 let binary_size = fs::metadata("temp_format_comparison.bin")?.len();
353
354 println!("Format comparison for UserProfile:");
355 println!(" JSON file size: {} bytes", json_size);
356 println!(" Binary file size: {} bytes", binary_size);
357 println!(
358 " Size ratio (JSON/Binary): {:.2}x",
359 json_size as f64 / binary_size as f64
360 );
361
362 // Demonstrate readability
363 let json_content = fs::read_to_string("temp_format_comparison.json")?;
364 println!("\nJSON format (human-readable):");
365 println!("{}", json_content);
366
367 println!("\nBinary format (first 32 bytes as hex):");
368 let binary_content = fs::read("temp_format_comparison.bin")?;
369 for (i, byte) in binary_content.iter().take(32).enumerate() {
370 if i % 16 == 0 && i > 0 {
371 println!();
372 }
373 print!("{:02x} ", byte);
374 }
375 println!("\n... ({} total bytes)", binary_content.len());
376
377 // Load and verify both formats produce identical results
378 let json_loaded = UserProfile::load_json("temp_format_comparison.json")?;
379 let binary_loaded = UserProfile::load_binary("temp_format_comparison.bin")?;
380
381 assert_eq!(json_loaded, binary_loaded);
382 println!("\nFormat consistency verification: PASSED");
383
384 Ok(())
385}

227     pub fn load_json(
228 path: &str,
229 config: FeedForwardConfig,
230 ) -> Result<Self, Box<dyn std::error::Error>> {
231 let mut layers = Vec::new();
232 let mut current_size = config.input_size;
233 let mut layer_idx = 0;
234
235 // Load hidden layers
236 for &hidden_size in &config.hidden_sizes {
237 let layer_path = format!("{}_layer_{}", path, layer_idx);
238 let weight_path = format!("{}_weight.json", layer_path);
239 let bias_path = format!("{}_bias.json", layer_path);
240
241 let weight = Tensor::load_json(&weight_path)?.with_requires_grad();
242 let bias = Tensor::load_json(&bias_path)?.with_requires_grad();
243
244 layers.push(LinearLayer {
245 weight,
246 bias,
247 input_size: current_size,
248 output_size: hidden_size,
249 });
250
251 current_size = hidden_size;
252 layer_idx += 1;
253 }
254
255 // Load output layer
256 let layer_path = format!("{}_layer_{}", path, layer_idx);
257 let weight_path = format!("{}_weight.json", layer_path);
258 let bias_path = format!("{}_bias.json", layer_path);
259
260 let weight = Tensor::load_json(&weight_path)?.with_requires_grad();
261 let bias = Tensor::load_json(&bias_path)?.with_requires_grad();
262
263 layers.push(LinearLayer {
264 weight,
265 bias,
266 input_size: current_size,
267 output_size: config.output_size,
268 });
269
270 Ok(Self { layers, config })
271     }

52fn demonstrate_tensor_serialization() -> Result<(), Box<dyn std::error::Error>> {
53 println!("--- Tensor Serialization ---");
54
55 // Create a tensor with some data
56 let original_tensor = Tensor::from_slice(&[1.0, 2.0, 3.0, 4.0, 5.0, 6.0], vec![2, 3]).unwrap();
57 println!(
58 "Original tensor: shape {:?}, data: {:?}",
59 original_tensor.shape().dims,
60 original_tensor.data()
61 );
62
63 // Save tensor in JSON format
64 let json_path = "temp_tensor.json";
65 original_tensor.save_json(json_path)?;
66 println!("Saved tensor to JSON: {}", json_path);
67
68 // Load tensor from JSON
69 let loaded_tensor_json = Tensor::load_json(json_path)?;
70 println!(
71 "Loaded from JSON: shape {:?}, data: {:?}",
72 loaded_tensor_json.shape().dims,
73 loaded_tensor_json.data()
74 );
75
76 // Verify data integrity
77 assert_eq!(
78 original_tensor.shape().dims,
79 loaded_tensor_json.shape().dims
80 );
81 assert_eq!(original_tensor.data(), loaded_tensor_json.data());
82 println!("JSON serialization verification: PASSED");
83
84 // Save tensor in binary format
85 let binary_path = "temp_tensor.bin";
86 original_tensor.save_binary(binary_path)?;
87 println!("Saved tensor to binary: {}", binary_path);
88
89 // Load tensor from binary
90 let loaded_tensor_binary = Tensor::load_binary(binary_path)?;
91 println!(
92 "Loaded from binary: shape {:?}, data: {:?}",
93 loaded_tensor_binary.shape().dims,
94 loaded_tensor_binary.data()
95 );
96
97 // Verify data integrity
98 assert_eq!(
99 original_tensor.shape().dims,
100 loaded_tensor_binary.shape().dims
101 );
102 assert_eq!(original_tensor.data(), loaded_tensor_binary.data());
103 println!("Binary serialization verification: PASSED");
104
105 Ok(())
106}
107
108/// Demonstrate optimizer serialization and deserialization
109fn demonstrate_optimizer_serialization() -> Result<(), Box<dyn std::error::Error>> {
110 println!("\n--- Optimizer Serialization ---");
111
112 // Create an optimizer with some parameters
113 let mut weight = Tensor::randn(vec![2, 2], Some(42)).with_requires_grad();
114 let mut bias = Tensor::randn(vec![2], Some(43)).with_requires_grad();
115
116 let config = AdamConfig {
117 learning_rate: 0.001,
118 beta1: 0.9,
119 beta2: 0.999,
120 eps: 1e-8,
121 weight_decay: 0.0,
122 amsgrad: false,
123 };
124
125 let mut optimizer = Adam::with_config(config);
126 optimizer.add_parameter(&weight);
127 optimizer.add_parameter(&bias);
128
129 println!(
130 "Created optimizer with {} parameters",
131 optimizer.parameter_count()
132 );
133 println!("Learning rate: {}", optimizer.learning_rate());
134
135 // Simulate some training steps
136 for _ in 0..3 {
137 let mut loss = weight.sum() + bias.sum();
138 loss.backward(None);
139 optimizer.step(&mut [&mut weight, &mut bias]);
140 optimizer.zero_grad(&mut [&mut weight, &mut bias]);
141 }
142
143 // Save optimizer state
144 let optimizer_path = "temp_optimizer.json";
145 optimizer.save_json(optimizer_path)?;
146 println!("Saved optimizer to: {}", optimizer_path);
147
148 // Load optimizer state
149 let loaded_optimizer = Adam::load_json(optimizer_path)?;
150 println!(
151 "Loaded optimizer with {} parameters",
152 loaded_optimizer.parameter_count()
153 );
154 println!("Learning rate: {}", loaded_optimizer.learning_rate());
155
156 // Verify optimizer state
157 assert_eq!(
158 optimizer.parameter_count(),
159 loaded_optimizer.parameter_count()
160 );
161 assert_eq!(optimizer.learning_rate(), loaded_optimizer.learning_rate());
162 println!("Optimizer serialization verification: PASSED");
163
164 Ok(())
165}
166
167/// Demonstrate format comparison and performance characteristics
168fn demonstrate_format_comparison() -> Result<(), Box<dyn std::error::Error>> {
169 println!("\n--- Format Comparison ---");
170
171 // Create a larger tensor for comparison
172 let tensor = Tensor::randn(vec![10, 10], Some(44));
173
174 // Save in both formats
175 tensor.save_json("temp_comparison.json")?;
176 tensor.save_binary("temp_comparison.bin")?;
177
178 // Compare file sizes
179 let json_size = fs::metadata("temp_comparison.json")?.len();
180 let binary_size = fs::metadata("temp_comparison.bin")?.len();
181
182 println!("JSON file size: {} bytes", json_size);
183 println!("Binary file size: {} bytes", binary_size);
184 println!(
185 "Compression ratio: {:.2}x",
186 json_size as f64 / binary_size as f64
187 );
188
189 // Load and verify both formats
190 let json_tensor = Tensor::load_json("temp_comparison.json")?;
191 let binary_tensor = Tensor::load_binary("temp_comparison.bin")?;
192
193 assert_eq!(tensor.shape().dims, json_tensor.shape().dims);
194 assert_eq!(tensor.shape().dims, binary_tensor.shape().dims);
195 assert_eq!(tensor.data(), json_tensor.data());
196 assert_eq!(tensor.data(), binary_tensor.data());
197
198 println!("Format comparison verification: PASSED");
199
200 Ok(())
201}
202
203/// Demonstrate a basic model checkpointing workflow
204fn demonstrate_model_checkpointing() -> Result<(), Box<dyn std::error::Error>> {
205 println!("\n--- Model Checkpointing ---");
206
207 // Create a simple model (weights and bias)
208 let mut weights = Tensor::randn(vec![2, 1], Some(45)).with_requires_grad();
209 let mut bias = Tensor::randn(vec![1], Some(46)).with_requires_grad();
210
211 // Create optimizer
212 let mut optimizer = Adam::with_learning_rate(0.01);
213 optimizer.add_parameter(&weights);
214 optimizer.add_parameter(&bias);
215
216 println!("Initial weights: {:?}", weights.data());
217 println!("Initial bias: {:?}", bias.data());
218
219 // Simulate training
220 for epoch in 0..5 {
221 let mut loss = weights.sum() + bias.sum();
222 loss.backward(None);
223 optimizer.step(&mut [&mut weights, &mut bias]);
224 optimizer.zero_grad(&mut [&mut weights, &mut bias]);
225
226 if epoch % 2 == 0 {
227 // Save checkpoint
228 let checkpoint_dir = format!("checkpoint_epoch_{}", epoch);
229 fs::create_dir_all(&checkpoint_dir)?;
230
231 weights.save_json(format!("{}/weights.json", checkpoint_dir))?;
232 bias.save_json(format!("{}/bias.json", checkpoint_dir))?;
233 optimizer.save_json(format!("{}/optimizer.json", checkpoint_dir))?;
234
235 println!("Saved checkpoint for epoch {}", epoch);
236 }
237 }
238
239 // Load from checkpoint
240 let loaded_weights = Tensor::load_json("checkpoint_epoch_4/weights.json")?;
241 let loaded_bias = Tensor::load_json("checkpoint_epoch_4/bias.json")?;
242 let loaded_optimizer = Adam::load_json("checkpoint_epoch_4/optimizer.json")?;
243
244 println!("Loaded weights: {:?}", loaded_weights.data());
245 println!("Loaded bias: {:?}", loaded_bias.data());
246 println!(
247 "Loaded optimizer learning rate: {}",
248 loaded_optimizer.learning_rate()
249 );
250
251 // Verify checkpoint integrity
252 assert_eq!(weights.shape().dims, loaded_weights.shape().dims);
253 assert_eq!(bias.shape().dims, loaded_bias.shape().dims);
254 assert_eq!(optimizer.learning_rate(), loaded_optimizer.learning_rate());
255
256 println!("Checkpointing verification: PASSED");
257
258 Ok(())
259}
260
261/// Demonstrate error handling for serialization operations
262fn demonstrate_error_handling() -> Result<(), Box<dyn std::error::Error>> {
263 println!("\n--- Error Handling ---");
264
265 // Test loading non-existent file
266 match Tensor::load_json("nonexistent_file.json") {
267 Ok(_) => println!("Unexpected: Successfully loaded non-existent file"),
268 Err(e) => println!("Expected error loading non-existent file: {}", e),
269 }
270
271 // Test loading with wrong format
272 let tensor = Tensor::randn(vec![2, 2], Some(47));
273 tensor.save_binary("temp_binary.bin")?;
274
275 match Tensor::load_json("temp_binary.bin") {
276 Ok(_) => println!("Unexpected: Successfully loaded binary as JSON"),
277 Err(e) => println!("Expected error loading binary as JSON: {}", e),
278 }
279
280 // Test loading corrupted file
281 fs::write("temp_invalid.json", "invalid json content")?;
282 match Tensor::load_json("temp_invalid.json") {
283 Ok(_) => println!("Unexpected: Successfully loaded invalid JSON"),
284 Err(e) => println!("Expected error loading invalid JSON: {}", e),
285 }
286
287 println!("Error handling verification: PASSED");
288
289 Ok(())
290}

345fn demonstrate_common_error_scenarios() -> Result<(), Box<dyn std::error::Error>> {
346 println!("--- Common Error Scenarios ---");
347
348 fs::create_dir_all("temp_error_tests")?;
349
350 // Scenario 1: Corrupted JSON file
351 println!("1. Corrupted JSON File:");
352 let corrupted_json = r#"{"name": "test", "value": 42, "incomplete"#;
353 fs::write("temp_error_tests/corrupted.json", corrupted_json)?;
354
355 match VersionedData::load_json("temp_error_tests/corrupted.json") {
356 Ok(_) => println!(" Unexpected: Corrupted JSON was parsed successfully"),
357 Err(e) => println!(" Expected error: {}", e),
358 }
359
360 // Scenario 2: Missing required fields
361 println!("\n2. Missing Required Fields:");
362 let incomplete_json = r#"{"name": "test"}"#;
363 fs::write("temp_error_tests/incomplete.json", incomplete_json)?;
364
365 match VersionedData::load_json("temp_error_tests/incomplete.json") {
366 Ok(_) => println!(" Unexpected: Incomplete JSON was parsed successfully"),
367 Err(e) => println!(" Expected error: {}", e),
368 }
369
370 // Scenario 3: Type mismatches
371 println!("\n3. Type Mismatch:");
372 let type_mismatch_json = r#"{"version": "not_a_number", "name": "test", "value": 42.0}"#;
373 fs::write("temp_error_tests/type_mismatch.json", type_mismatch_json)?;
374
375 match VersionedData::load_json("temp_error_tests/type_mismatch.json") {
376 Ok(_) => println!(" Unexpected: Type mismatch was handled gracefully"),
377 Err(e) => println!(" Expected error: {}", e),
378 }
379
380 // Scenario 4: File not found
381 println!("\n4. File Not Found:");
382 match VersionedData::load_json("temp_error_tests/nonexistent.json") {
383 Ok(_) => println!(" Unexpected: Non-existent file was loaded"),
384 Err(e) => println!(" Expected error: {}", e),
385 }
386
387 // Scenario 5: Binary format mismatch
388 println!("\n5. Binary Format Mismatch:");
389 let invalid_binary = vec![0xFF, 0xFF, 0xFF, 0xFF]; // Invalid binary data
390 fs::write("temp_error_tests/invalid.bin", invalid_binary)?;
391
392 match VersionedData::load_binary("temp_error_tests/invalid.bin") {
393 Ok(_) => println!(" Unexpected: Invalid binary was parsed successfully"),
394 Err(e) => println!(" Expected error: {}", e),
395 }
396
397 // Scenario 6: Wrong format loading
398 println!("\n6. Wrong Format Loading:");
399 let valid_data = VersionedData {
400 version: 1,
401 name: "test".to_string(),
402 value: 42.0,
403 optional_field: None,
404 new_field: None,
405 };
406 valid_data.save_binary("temp_error_tests/valid.bin")?;
407
408 // Try to load binary file as JSON
409 match VersionedData::load_json("temp_error_tests/valid.bin") {
410 Ok(_) => println!(" Unexpected: Binary file was loaded as JSON"),
411 Err(e) => println!(" Expected error: {}", e),
412 }
413
414 Ok(())
415}
416
417/// Demonstrate validation patterns
418fn demonstrate_validation_patterns() -> Result<(), Box<dyn std::error::Error>> {
419 println!("\n--- Validation Patterns ---");
420
421 println!("Testing input validation with various scenarios:");
422
423 // Valid input
424 println!("\n1. Valid Input:");
425 let mut valid_preferences = HashMap::new();
426 valid_preferences.insert("theme".to_string(), "dark".to_string());
427 valid_preferences.insert("language".to_string(), "en".to_string());
428
429 let valid_input = ValidatedUserInput {
430 username: "john_doe".to_string(),
431 email: "john@example.com".to_string(),
432 age: 25,
433 preferences: valid_preferences,
434 };
435
436 match valid_input.to_json() {
437 Ok(json) => {
438 println!(" ✓ Valid input serialized successfully");
439 match ValidatedUserInput::from_json(&json) {
440 Ok(_) => println!(" ✓ Valid input deserialized successfully"),
441 Err(e) => println!(" ✗ Deserialization failed: {}", e),
442 }
443 }
444 Err(e) => println!(" ✗ Serialization failed: {}", e),
445 }
446
447 // Test validation errors
448 let validation_tests = vec![
449 (
450 "Empty username",
451 ValidatedUserInput {
452 username: "".to_string(),
453 email: "test@example.com".to_string(),
454 age: 25,
455 preferences: HashMap::new(),
456 },
457 ),
458 (
459 "Invalid username characters",
460 ValidatedUserInput {
461 username: "user@name!".to_string(),
462 email: "test@example.com".to_string(),
463 age: 25,
464 preferences: HashMap::new(),
465 },
466 ),
467 (
468 "Invalid email",
469 ValidatedUserInput {
470 username: "username".to_string(),
471 email: "invalid_email".to_string(),
472 age: 25,
473 preferences: HashMap::new(),
474 },
475 ),
476 (
477 "Age too low",
478 ValidatedUserInput {
479 username: "username".to_string(),
480 email: "test@example.com".to_string(),
481 age: 10,
482 preferences: HashMap::new(),
483 },
484 ),
485 (
486 "Age too high",
487 ValidatedUserInput {
488 username: "username".to_string(),
489 email: "test@example.com".to_string(),
490 age: 150,
491 preferences: HashMap::new(),
492 },
493 ),
494 ];
495
496 for (description, invalid_input) in validation_tests {
497 println!("\n2. {}:", description);
498 match invalid_input.to_json() {
499 Ok(json) => match ValidatedUserInput::from_json(&json) {
500 Ok(_) => println!(" ✗ Unexpected: Invalid input was accepted"),
501 Err(e) => println!(" ✓ Expected validation error: {}", e),
502 },
503 Err(e) => println!(" ✗ Serialization error: {}", e),
504 }
505 }
506
507 // Test preferences validation
508 println!("\n3. Preferences Validation:");
509 let mut too_many_preferences = HashMap::new();
510 for i in 0..25 {
511 too_many_preferences.insert(format!("pref_{}", i), "value".to_string());
512 }
513
514 let invalid_prefs_input = ValidatedUserInput {
515 username: "username".to_string(),
516 email: "test@example.com".to_string(),
517 age: 25,
518 preferences: too_many_preferences,
519 };
520
521 match invalid_prefs_input.to_json() {
522 Ok(json) => match ValidatedUserInput::from_json(&json) {
523 Ok(_) => println!(" ✗ Unexpected: Too many preferences were accepted"),
524 Err(e) => println!(" ✓ Expected validation error: {}", e),
525 },
526 Err(e) => println!(" ✗ Serialization error: {}", e),
527 }
528
529 Ok(())
530}
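The validation outcomes above imply a rule set roughly like the one below. The exact bounds (age 13-120, the underscore-only username charset, the naive email check) are assumptions inferred from the test cases, not the crate's actual limits:

```rust
/// Non-empty, ASCII alphanumerics and underscores only.
fn validate_username(username: &str) -> bool {
    !username.is_empty()
        && username
            .chars()
            .all(|c| c.is_ascii_alphanumeric() || c == '_')
}

/// Deliberately naive: require one '@' with text on both sides.
fn validate_email(email: &str) -> bool {
    match email.split_once('@') {
        Some((local, domain)) => !local.is_empty() && !domain.is_empty(),
        None => false,
    }
}

/// Assumed bounds, chosen so that 10 and 150 are rejected as in the example.
fn validate_age(age: u32) -> bool {
    (13..=120).contains(&age)
}

fn main() {
    assert!(validate_username("john_doe"));
    assert!(!validate_username("user@name!"));
    assert!(validate_email("john@example.com"));
    assert!(!validate_email("invalid_email"));
    assert!(!validate_age(10));
    assert!(!validate_age(150));
    println!("all validation checks behave as expected");
}
```

Running checks like these inside `from_deserializer` is what turns deserialization failures into the validation errors printed above.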
531
532/// Demonstrate schema evolution patterns
533fn demonstrate_schema_evolution() -> Result<(), Box<dyn std::error::Error>> {
534 println!("\n--- Schema Evolution Patterns ---");
535
536 fs::create_dir_all("temp_schema_tests")?;
537
538 // Create data with different schema versions
539 println!("Creating data with different schema versions:");
540
541 // Version 1 data (minimal)
542 let v1_json = r#"{
543 "version": 1,
544 "name": "legacy_data",
545 "value": 123.45
546 }"#;
547 fs::write("temp_schema_tests/v1_data.json", v1_json)?;
548 println!(" ✓ Version 1 data created (minimal fields)");
549
550 // Version 2 data (with optional field)
551 let v2_json = r#"{
552 "version": 2,
553 "name": "v2_data",
554 "value": 678.90,
555 "optional_field": "added_in_v2"
556 }"#;
557 fs::write("temp_schema_tests/v2_data.json", v2_json)?;
558 println!(" ✓ Version 2 data created (with optional field)");
559
560 // Version 3 data (with all fields)
561 let v3_data = VersionedData {
562 version: 3,
563 name: "v3_data".to_string(),
564 value: 999.99,
565 optional_field: Some("present".to_string()),
566 new_field: Some(42),
567 };
568 v3_data.save_json("temp_schema_tests/v3_data.json")?;
569 println!(" ✓ Version 3 data created (all fields)");
570
571 // Test backward compatibility
572 println!("\nTesting backward compatibility:");
573
574 // Load v1 data with current deserializer
575 match VersionedData::load_json("temp_schema_tests/v1_data.json") {
576 Ok(data) => {
577 println!(" ✓ V1 data loaded successfully:");
578 println!(" Name: {}", data.name);
579 println!(" Value: {}", data.value);
580 println!(" Optional field: {:?}", data.optional_field);
581 println!(" New field: {:?}", data.new_field);
582 }
583 Err(e) => println!(" ✗ Failed to load V1 data: {}", e),
584 }
585
586 // Load v2 data with current deserializer
587 match VersionedData::load_json("temp_schema_tests/v2_data.json") {
588 Ok(data) => {
589 println!(" ✓ V2 data loaded successfully:");
590 println!(" Name: {}", data.name);
591 println!(" Value: {}", data.value);
592 println!(" Optional field: {:?}", data.optional_field);
593 println!(" New field: {:?}", data.new_field);
594 }
595 Err(e) => println!(" ✗ Failed to load V2 data: {}", e),
596 }
597
598 // Test future version rejection
599 println!("\nTesting future version handling:");
600 let future_version_json = r#"{
601 "version": 99,
602 "name": "future_data",
603 "value": 123.45,
604 "unknown_field": "should_be_ignored"
605 }"#;
606 fs::write("temp_schema_tests/future_data.json", future_version_json)?;
607
608 match VersionedData::load_json("temp_schema_tests/future_data.json") {
609 Ok(_) => println!(" ✗ Unexpected: Future version was accepted"),
610 Err(e) => println!(" ✓ Expected rejection of future version: {}", e),
611 }
612
613 // Demonstrate migration strategy
614 println!("\nDemonstrating migration strategy:");
615 println!(" Strategy: Load old format, upgrade to new format, save");
616
617 // Simulate migrating v1 data to v3 format
618 let v1_loaded = VersionedData::load_json("temp_schema_tests/v1_data.json")?;
619 let v1_upgraded = VersionedData {
620 version: 3,
621 name: v1_loaded.name,
622 value: v1_loaded.value,
623 optional_field: Some("migrated_default".to_string()),
624 new_field: Some(0),
625 };
626
627 v1_upgraded.save_json("temp_schema_tests/v1_migrated.json")?;
628 println!(" ✓ V1 data migrated to V3 format");
629
630 Ok(())
631}
632
633/// Demonstrate recovery strategies
634fn demonstrate_recovery_strategies() -> Result<(), Box<dyn std::error::Error>> {
635 println!("\n--- Recovery Strategies ---");
636
637 fs::create_dir_all("temp_recovery_tests")?;
638
639 // Strategy 1: Graceful degradation
640 println!("1. Graceful Degradation Strategy:");
641
642 // Create complete data
643 let complete_data = RecoverableData {
644 critical_field: "essential_info".to_string(),
645 important_field: Some("important_info".to_string()),
646 optional_field: Some("nice_to_have".to_string()),
647 metadata: {
648 let mut map = HashMap::new();
649 map.insert("key1".to_string(), "value1".to_string());
650 map.insert("key2".to_string(), "value2".to_string());
651 map
652 },
653 };
654
655 // Save complete data
656 complete_data.save_json("temp_recovery_tests/complete.json")?;
657
658 // Create partial data (missing some fields)
659 let partial_json = r#"{
660 "critical_field": "essential_info",
661 "optional_field": "nice_to_have"
662 }"#;
663 fs::write("temp_recovery_tests/partial.json", partial_json)?;
664
665 // Load partial data and demonstrate recovery
666 match RecoverableData::load_json("temp_recovery_tests/partial.json") {
667 Ok(recovered) => {
668 println!(" ✓ Partial data recovered successfully:");
669 println!(" Critical field: {}", recovered.critical_field);
670 println!(
671 " Important field: {:?} (missing, set to None)",
672 recovered.important_field
673 );
674 println!(" Optional field: {:?}", recovered.optional_field);
675 println!(
676 " Metadata: {} entries (defaulted to empty)",
677 recovered.metadata.len()
678 );
679 }
680 Err(e) => println!(" ✗ Recovery failed: {}", e),
681 }
682
683 // Strategy 2: Error context preservation
684 println!("\n2. Error Context Preservation:");
685
686 let malformed_json = r#"{
687 "critical_field": "essential_info",
688 "important_field": 12345,
689 "metadata": "not_a_map"
690 }"#;
691 fs::write("temp_recovery_tests/malformed.json", malformed_json)?;
692
693 match RecoverableData::load_json("temp_recovery_tests/malformed.json") {
694 Ok(_) => println!(" ✗ Unexpected: Malformed data was accepted"),
695 Err(e) => {
696 println!(" ✓ Error context preserved:");
697 println!(" Error: {}", e);
698 println!(" Error type: {:?}", std::mem::discriminant(&e));
699 }
700 }
701
702 // Strategy 3: Fallback data sources
703 println!("\n3. Fallback Data Sources:");
704
705 // Primary source (corrupted)
706 let corrupted_primary = "corrupted data";
707 fs::write("temp_recovery_tests/primary.json", corrupted_primary)?;
708
709 // Backup source (valid)
710 let backup_data = RecoverableData {
711 critical_field: "backup_critical".to_string(),
712 important_field: Some("backup_important".to_string()),
713 optional_field: None,
714 metadata: HashMap::new(),
715 };
716 backup_data.save_json("temp_recovery_tests/backup.json")?;
717
718 // Default fallback
719 let default_data = RecoverableData {
720 critical_field: "default_critical".to_string(),
721 important_field: None,
722 optional_field: None,
723 metadata: HashMap::new(),
724 };
725
726 println!(" Attempting to load data with fallback chain:");
727
728 // Try primary source
729 let loaded_data = match RecoverableData::load_json("temp_recovery_tests/primary.json") {
730 Ok(data) => {
731 println!(" ✓ Loaded from primary source");
732 data
733 }
734 Err(_) => {
735 println!(" ✗ Primary source failed, trying backup");
736
737 // Try backup source
738 match RecoverableData::load_json("temp_recovery_tests/backup.json") {
739 Ok(data) => {
740 println!(" ✓ Loaded from backup source");
741 data
742 }
743 Err(_) => {
744 println!(" ✗ Backup source failed, using default");
745 default_data
746 }
747 }
748 }
749 };
750
751 println!(" Final loaded data:");
752 println!(" Critical field: {}", loaded_data.critical_field);
753
754 Ok(())
755}
756
757/// Demonstrate production-ready error handling
758fn demonstrate_production_error_handling() -> Result<(), Box<dyn std::error::Error>> {
759 println!("\n--- Production Error Handling ---");
760
761 fs::create_dir_all("temp_production_tests")?;
762
763 // Error logging and monitoring
764 println!("1. Error Logging and Monitoring:");
765
766 let test_data = VersionedData {
767 version: 2,
768 name: "production_test".to_string(),
769 value: 42.0,
770 optional_field: Some("test".to_string()),
771 new_field: None,
772 };
773
774 // Create error log for demonstration
775 let mut error_log = fs::OpenOptions::new()
776 .create(true)
777 .append(true)
778 .open("temp_production_tests/error.log")?;
779
780 // Function to log errors in production format
781 let mut log_error =
782 |error: &SerializationError, context: &str| -> Result<(), Box<dyn std::error::Error>> {
783 let timestamp = std::time::SystemTime::now()
784 .duration_since(std::time::UNIX_EPOCH)?
785 .as_secs();
786
787 writeln!(error_log, "[{}] ERROR in {}: {}", timestamp, context, error)?;
788 Ok(())
789 };
790
791 // Simulate various error scenarios with logging
792 let error_scenarios = vec![
793 ("corrupted_file.json", "invalid json content"),
794 ("missing_fields.json", r#"{"version": 1}"#),
795 (
796 "type_error.json",
797 r#"{"version": "not_number", "name": "test", "value": 42.0}"#,
798 ),
799 ];
800
801 for (filename, content) in error_scenarios {
802 let filepath = format!("temp_production_tests/{}", filename);
803 fs::write(&filepath, content)?;
804
805 match VersionedData::load_json(&filepath) {
806 Ok(_) => println!(" ✗ Unexpected success for {}", filename),
807 Err(e) => {
808 log_error(&e, &format!("load_config({})", filename))?;
809 println!(" ✓ Error logged for {}: {}", filename, e);
810 }
811 }
812 }
813
814 // Health check pattern
815 println!("\n2. Health Check Pattern:");
816
817 let health_check = || -> Result<bool, SerializationError> {
818 // Check if we can serialize/deserialize basic data
819 let test_data = VersionedData {
820 version: 1,
821 name: "health_check".to_string(),
822 value: 1.0,
823 optional_field: None,
824 new_field: None,
825 };
826
827 let serialized = test_data.to_json()?;
828 let _deserialized = VersionedData::from_json(&serialized)?;
829 Ok(true)
830 };
831
832 match health_check() {
833 Ok(_) => println!(" ✓ Serialization system health check passed"),
834 Err(e) => {
835 log_error(&e, "health_check")?;
836 println!(" ✗ Serialization system health check failed: {}", e);
837 }
838 }
839
840 // Circuit breaker pattern simulation
841 println!("\n3. Circuit Breaker Pattern:");
842
843 struct CircuitBreaker {
844 failure_count: u32,
845 failure_threshold: u32,
846 is_open: bool,
847 }
848
849 impl CircuitBreaker {
850 fn new(threshold: u32) -> Self {
851 Self {
852 failure_count: 0,
853 failure_threshold: threshold,
854 is_open: false,
855 }
856 }
857
858 fn call<F, T>(&mut self, operation: F) -> Result<T, String>
859 where
860 F: FnOnce() -> Result<T, SerializationError>,
861 {
862 if self.is_open {
863 return Err("Circuit breaker is open".to_string());
864 }
865
866 match operation() {
867 Ok(result) => {
868 self.failure_count = 0; // Reset on success
869 Ok(result)
870 }
871 Err(e) => {
872 self.failure_count += 1;
873 if self.failure_count >= self.failure_threshold {
874 self.is_open = true;
875 println!(
876 " Circuit breaker opened after {} failures",
877 self.failure_count
878 );
879 }
880 Err(e.to_string())
881 }
882 }
883 }
884 }
885
886 let mut circuit_breaker = CircuitBreaker::new(3);
887
888 // Simulate operations that fail
889 for i in 1..=5 {
890 let result = circuit_breaker
891 .call(|| VersionedData::load_json("temp_production_tests/corrupted_file.json"));
892
893 match result {
894 Ok(_) => println!(" Operation {} succeeded", i),
895 Err(e) => println!(" Operation {} failed: {}", i, e),
896 }
897 }
898
899 // Retry mechanism
900 println!("\n4. Retry Mechanism:");
901
902 let retry_operation = |max_attempts: u32| -> Result<VersionedData, String> {
903 for attempt in 1..=max_attempts {
904 println!(" Attempt {}/{}", attempt, max_attempts);
905
906 // Try different sources in order
907 let sources = vec![
908 "temp_production_tests/corrupted_file.json",
909 "temp_production_tests/missing_fields.json",
910 "temp_production_tests/backup_valid.json",
911 ];
912
913 if attempt == max_attempts {
914 // On final attempt, create valid backup
915 test_data
916 .save_json("temp_production_tests/backup_valid.json")
917 .map_err(|e| format!("Failed to create backup: {}", e))?;
918 }
919
920 for source in &sources {
921 match VersionedData::load_json(source) {
922 Ok(data) => {
923 println!(" ✓ Succeeded loading from {}", source);
924 return Ok(data);
925 }
926 Err(_) => {
927 println!(" ✗ Failed to load from {}", source);
928 continue;
929 }
930 }
931 }
932
933 if attempt < max_attempts {
934 println!(" Waiting before retry...");
935 // In real code, would sleep here
936 }
937 }
938
939 Err("All retry attempts exhausted".to_string())
940 };
941
942 match retry_operation(3) {
943 Ok(data) => println!(" ✓ Retry succeeded: {}", data.name),
944 Err(e) => println!(" ✗ Retry failed: {}", e),
945 }
946
947 Ok(())
948}

579fn demonstrate_nested_struct_creation() -> Result<(), Box<dyn std::error::Error>> {
580 println!("--- Nested Structure Creation ---");
581
582 // Create nested address and contact info
583 let headquarters = Address {
584 street: "123 Innovation Drive".to_string(),
585 city: "Tech City".to_string(),
586 state: "CA".to_string(),
587 postal_code: "94000".to_string(),
588 country: "USA".to_string(),
589 };
590
591 let mut social_media = HashMap::new();
592 social_media.insert("twitter".to_string(), "@techcorp".to_string());
593 social_media.insert("linkedin".to_string(), "techcorp-inc".to_string());
594
595 let contact_info = ContactInfo {
596 email: "info@techcorp.com".to_string(),
597 phone: Some("+1-555-0123".to_string()),
598 address_city: headquarters.city.clone(),
599 address_state: headquarters.state.clone(),
600 social_media,
601 };
602
603 // Create departments with nested office locations
604 let engineering_office = Address {
605 street: "456 Developer Lane".to_string(),
606 city: "Code City".to_string(),
607 state: "CA".to_string(),
608 postal_code: "94001".to_string(),
609 country: "USA".to_string(),
610 };
611
612 let departments = [
613 Department {
614 name: "Engineering".to_string(),
615 manager: "Alice Johnson".to_string(),
616 employee_count: 50,
617 budget: 2500000.0,
618 office_locations: vec![engineering_office, headquarters.clone()],
619 },
620 Department {
621 name: "Marketing".to_string(),
622 manager: "Bob Smith".to_string(),
623 employee_count: 15,
624 budget: 800000.0,
625 office_locations: vec![headquarters.clone()],
626 },
627 ];
628
629 // Create projects with milestones
630 let milestones = vec![
631 Milestone {
632 name: "Requirements Analysis".to_string(),
633 description: "Complete system requirements documentation".to_string(),
634 due_date: "2024-03-15".to_string(),
635 is_completed: true,
636 progress_percentage: 100.0,
637 dependencies: vec![],
638 },
639 Milestone {
640 name: "Architecture Design".to_string(),
641 description: "Define system architecture and components".to_string(),
642 due_date: "2024-04-01".to_string(),
643 is_completed: false,
644 progress_percentage: 75.0,
645 dependencies: vec!["Requirements Analysis".to_string()],
646 },
647 ];
648
649 let mut project_metadata = HashMap::new();
650 project_metadata.insert("priority".to_string(), "high".to_string());
651 project_metadata.insert("client".to_string(), "internal".to_string());
652
653 let projects = [Project {
654 name: "Train Station ML Platform".to_string(),
655 description: "Next-generation machine learning infrastructure".to_string(),
656 status: ProjectStatus::InProgress,
657 budget: 1500000.0,
658 team_members: vec![
659 "Alice Johnson".to_string(),
660 "Charlie Brown".to_string(),
661 "Diana Prince".to_string(),
662 ],
663 milestones: milestones.clone(),
664 metadata: project_metadata,
665 }];
666
667 // Create the complete company structure
668 let mut company_metadata = HashMap::new();
669 company_metadata.insert("industry".to_string(), "technology".to_string());
670 company_metadata.insert("stock_symbol".to_string(), "TECH".to_string());
671
672 let company = Company {
673 name: "TechCorp Inc.".to_string(),
674 founded_year: 2015,
675 headquarters_city: headquarters.city.clone(),
676 headquarters_state: headquarters.state.clone(),
677 employee_count: 250,
678 department_names: departments.iter().map(|d| d.name.clone()).collect(),
679 active_project_names: projects.iter().map(|p| p.name.clone()).collect(),
680 company_metadata,
681 };
682
683 println!("Created complex company structure:");
684 println!(" Company: {}", company.name);
685 println!(" Founded: {}", company.founded_year);
686 println!(
687 " Headquarters: {}, {}",
688 company.headquarters_city, company.headquarters_state
689 );
690 println!(" Employee Count: {}", company.employee_count);
691 println!(" Departments: {}", company.department_names.len());
692 println!(" Active Projects: {}", company.active_project_names.len());
693
694 // Save the complete structure
695 company.save_json("temp_nested_company.json")?;
696 println!("Saved nested structure to: temp_nested_company.json");
697
698 // Verify loading preserves all nested data
699 let loaded_company = Company::load_json("temp_nested_company.json")?;
700 assert_eq!(company, loaded_company);
701 println!("Successfully verified Company roundtrip serialization");
702
703 // Also demonstrate individual component serialization
704 let address_json = headquarters.to_json()?;
705 let loaded_address = Address::from_json(&address_json)?;
706 assert_eq!(headquarters, loaded_address);
707 println!("Successfully serialized/deserialized Address component");
708
709 let contact_json = contact_info.to_json()?;
710 let loaded_contact = ContactInfo::from_json(&contact_json)?;
711 assert_eq!(contact_info, loaded_contact);
712 println!("Successfully serialized/deserialized ContactInfo component");
713 println!("Nested structure integrity: VERIFIED");
714
715 Ok(())
716}

Source
fn load_binary<P: AsRef<Path>>(path: P) -> SerializationResult<Self>
Loads the struct from a binary file
Reads binary data from the specified file path and deserializes it into a new instance of the struct.
§Arguments
path - File path containing the binary data to read
§Returns
Ok(Self) on successful deserialization
Err(SerializationError) if file I/O or deserialization fails
§Examples
Loads struct data from a binary file with proper error handling.
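A standalone sketch of the roundtrip this method completes, using only std; the `Point` struct and its fixed little-endian field layout are hypothetical stand-ins for illustration, not the crate's actual binary format:

```rust
use std::fs;
use std::io::{self, Read, Write};
use std::path::Path;

// Hypothetical stand-in for any StructSerializable type.
#[derive(Debug, PartialEq)]
struct Point {
    x: f64,
    y: f64,
}

impl Point {
    // Sketch of save_binary: write each field in a fixed layout.
    fn save_binary<P: AsRef<Path>>(&self, path: P) -> io::Result<()> {
        let mut file = fs::File::create(path)?;
        file.write_all(&self.x.to_le_bytes())?;
        file.write_all(&self.y.to_le_bytes())?;
        Ok(())
    }

    // Sketch of load_binary: read the file, validate its length, and
    // rebuild the struct, surfacing both I/O and format errors.
    fn load_binary<P: AsRef<Path>>(path: P) -> io::Result<Self> {
        let mut buf = Vec::new();
        fs::File::open(path)?.read_to_end(&mut buf)?;
        if buf.len() != 16 {
            return Err(io::Error::new(
                io::ErrorKind::InvalidData,
                "unexpected file length",
            ));
        }
        let x = f64::from_le_bytes(buf[0..8].try_into().unwrap());
        let y = f64::from_le_bytes(buf[8..16].try_into().unwrap());
        Ok(Point { x, y })
    }
}

fn main() -> io::Result<()> {
    let original = Point { x: 1.5, y: -2.0 };
    original.save_binary("temp_point.bin")?;
    let loaded = Point::load_binary("temp_point.bin")?;
    assert_eq!(original, loaded);
    fs::remove_file("temp_point.bin")?;
    Ok(())
}
```

As in the repository examples below, a missing or truncated file surfaces as an `Err` for the caller to handle rather than a panic.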
Examples found in repository
52fn demonstrate_tensor_serialization() -> Result<(), Box<dyn std::error::Error>> {
53 println!("--- Tensor Serialization ---");
54
55 // Create a tensor with some data
56 let original_tensor = Tensor::from_slice(&[1.0, 2.0, 3.0, 4.0, 5.0, 6.0], vec![2, 3]).unwrap();
57 println!(
58 "Original tensor: shape {:?}, data: {:?}",
59 original_tensor.shape().dims,
60 original_tensor.data()
61 );
62
63 // Save tensor in JSON format
64 let json_path = "temp_tensor.json";
65 original_tensor.save_json(json_path)?;
66 println!("Saved tensor to JSON: {}", json_path);
67
68 // Load tensor from JSON
69 let loaded_tensor_json = Tensor::load_json(json_path)?;
70 println!(
71 "Loaded from JSON: shape {:?}, data: {:?}",
72 loaded_tensor_json.shape().dims,
73 loaded_tensor_json.data()
74 );
75
76 // Verify data integrity
77 assert_eq!(
78 original_tensor.shape().dims,
79 loaded_tensor_json.shape().dims
80 );
81 assert_eq!(original_tensor.data(), loaded_tensor_json.data());
82 println!("JSON serialization verification: PASSED");
83
84 // Save tensor in binary format
85 let binary_path = "temp_tensor.bin";
86 original_tensor.save_binary(binary_path)?;
87 println!("Saved tensor to binary: {}", binary_path);
88
89 // Load tensor from binary
90 let loaded_tensor_binary = Tensor::load_binary(binary_path)?;
91 println!(
92 "Loaded from binary: shape {:?}, data: {:?}",
93 loaded_tensor_binary.shape().dims,
94 loaded_tensor_binary.data()
95 );
96
97 // Verify data integrity
98 assert_eq!(
99 original_tensor.shape().dims,
100 loaded_tensor_binary.shape().dims
101 );
102 assert_eq!(original_tensor.data(), loaded_tensor_binary.data());
103 println!("Binary serialization verification: PASSED");
104
105 Ok(())
106}
107
108/// Demonstrate optimizer serialization and deserialization
109fn demonstrate_optimizer_serialization() -> Result<(), Box<dyn std::error::Error>> {
110 println!("\n--- Optimizer Serialization ---");
111
112 // Create an optimizer with some parameters
113 let mut weight = Tensor::randn(vec![2, 2], Some(42)).with_requires_grad();
114 let mut bias = Tensor::randn(vec![2], Some(43)).with_requires_grad();
115
116 let config = AdamConfig {
117 learning_rate: 0.001,
118 beta1: 0.9,
119 beta2: 0.999,
120 eps: 1e-8,
121 weight_decay: 0.0,
122 amsgrad: false,
123 };
124
125 let mut optimizer = Adam::with_config(config);
126 optimizer.add_parameter(&weight);
127 optimizer.add_parameter(&bias);
128
129 println!(
130 "Created optimizer with {} parameters",
131 optimizer.parameter_count()
132 );
133 println!("Learning rate: {}", optimizer.learning_rate());
134
135 // Simulate some training steps
136 for _ in 0..3 {
137 let mut loss = weight.sum() + bias.sum();
138 loss.backward(None);
139 optimizer.step(&mut [&mut weight, &mut bias]);
140 optimizer.zero_grad(&mut [&mut weight, &mut bias]);
141 }
142
143 // Save optimizer state
144 let optimizer_path = "temp_optimizer.json";
145 optimizer.save_json(optimizer_path)?;
146 println!("Saved optimizer to: {}", optimizer_path);
147
148 // Load optimizer state
149 let loaded_optimizer = Adam::load_json(optimizer_path)?;
150 println!(
151 "Loaded optimizer with {} parameters",
152 loaded_optimizer.parameter_count()
153 );
154 println!("Learning rate: {}", loaded_optimizer.learning_rate());
155
156 // Verify optimizer state
157 assert_eq!(
158 optimizer.parameter_count(),
159 loaded_optimizer.parameter_count()
160 );
161 assert_eq!(optimizer.learning_rate(), loaded_optimizer.learning_rate());
162 println!("Optimizer serialization verification: PASSED");
163
164 Ok(())
165}
166
167/// Demonstrate format comparison and performance characteristics
168fn demonstrate_format_comparison() -> Result<(), Box<dyn std::error::Error>> {
169 println!("\n--- Format Comparison ---");
170
171 // Create a larger tensor for comparison
172 let tensor = Tensor::randn(vec![10, 10], Some(44));
173
174 // Save in both formats
175 tensor.save_json("temp_comparison.json")?;
176 tensor.save_binary("temp_comparison.bin")?;
177
178 // Compare file sizes
179 let json_size = fs::metadata("temp_comparison.json")?.len();
180 let binary_size = fs::metadata("temp_comparison.bin")?.len();
181
182 println!("JSON file size: {} bytes", json_size);
183 println!("Binary file size: {} bytes", binary_size);
184 println!(
185 "Compression ratio: {:.2}x",
186 json_size as f64 / binary_size as f64
187 );
188
189 // Load and verify both formats
190 let json_tensor = Tensor::load_json("temp_comparison.json")?;
191 let binary_tensor = Tensor::load_binary("temp_comparison.bin")?;
192
193 assert_eq!(tensor.shape().dims, json_tensor.shape().dims);
194 assert_eq!(tensor.shape().dims, binary_tensor.shape().dims);
195 assert_eq!(tensor.data(), json_tensor.data());
196 assert_eq!(tensor.data(), binary_tensor.data());
197
198 println!("Format comparison verification: PASSED");
199
200 Ok(())
201}

More examples
272fn demonstrate_app_settings_serialization() -> Result<(), Box<dyn std::error::Error>> {
273 println!("\n--- App Settings Serialization ---");
274
275 // Create app settings with collections and optional fields
276 let mut env_vars = HashMap::new();
277 env_vars.insert("LOG_LEVEL".to_string(), "info".to_string());
278 env_vars.insert("PORT".to_string(), "8080".to_string());
279 env_vars.insert("HOST".to_string(), "localhost".to_string());
280
281 let settings = AppSettings {
282 app_name: "Train Station Example".to_string(),
283 version: "1.0.0".to_string(),
284 debug_mode: true,
285 max_connections: 100,
286 timeout_seconds: 30.5,
287 features: vec![
288 "authentication".to_string(),
289 "logging".to_string(),
290 "metrics".to_string(),
291 ],
292 environment_vars: env_vars,
293 optional_database_url: Some("postgresql://localhost:5432/mydb".to_string()),
294 };
295
296 println!("Original app settings:");
297 println!(" App Name: {}", settings.app_name);
298 println!(" Version: {}", settings.version);
299 println!(" Debug Mode: {}", settings.debug_mode);
300 println!(" Max Connections: {}", settings.max_connections);
301 println!(" Timeout: {} seconds", settings.timeout_seconds);
302 println!(" Features: {:?}", settings.features);
303 println!(" Environment Variables: {:?}", settings.environment_vars);
304 println!(" Database URL: {:?}", settings.optional_database_url);
305
306 // Serialize to binary format for efficient storage
307 let binary_data = settings.to_binary()?;
308 println!("\nSerialized to binary: {} bytes", binary_data.len());
309
310 // Save to binary file
311 settings.save_binary("temp_app_settings.bin")?;
312 println!("Saved to file: temp_app_settings.bin");
313
314 // Load from binary file
315 let loaded_settings = AppSettings::load_binary("temp_app_settings.bin")?;
316 println!("\nLoaded app settings:");
317 println!(" App Name: {}", loaded_settings.app_name);
318 println!(" Version: {}", loaded_settings.version);
319 println!(" Debug Mode: {}", loaded_settings.debug_mode);
320 println!(" Features count: {}", loaded_settings.features.len());
321 println!(
322 " Environment variables count: {}",
323 loaded_settings.environment_vars.len()
324 );
325
326 // Verify data integrity
327 assert_eq!(settings, loaded_settings);
328 println!("Data integrity verification: PASSED");
329
330 Ok(())
331}
332
333/// Demonstrate format comparison between JSON and binary
334fn demonstrate_format_comparison() -> Result<(), Box<dyn std::error::Error>> {
335 println!("\n--- Format Comparison ---");
336
337 let user = UserProfile {
338 id: 98765,
339 username: "bob_builder".to_string(),
340 email: "bob@construction.com".to_string(),
341 age: 35,
342 is_active: false,
343 score: 87.2,
344 };
345
346 // Save in both formats
347 user.save_json("temp_format_comparison.json")?;
348 user.save_binary("temp_format_comparison.bin")?;
349
350 // Compare file sizes
351 let json_size = fs::metadata("temp_format_comparison.json")?.len();
352 let binary_size = fs::metadata("temp_format_comparison.bin")?.len();
353
354 println!("Format comparison for UserProfile:");
355 println!(" JSON file size: {} bytes", json_size);
356 println!(" Binary file size: {} bytes", binary_size);
357 println!(
358 " Size ratio (JSON/Binary): {:.2}x",
359 json_size as f64 / binary_size as f64
360 );
361
362 // Demonstrate readability
363 let json_content = fs::read_to_string("temp_format_comparison.json")?;
364 println!("\nJSON format (human-readable):");
365 println!("{}", json_content);
366
367 println!("\nBinary format (first 32 bytes as hex):");
368 let binary_content = fs::read("temp_format_comparison.bin")?;
369 for (i, byte) in binary_content.iter().take(32).enumerate() {
370 if i % 16 == 0 && i > 0 {
371 println!();
372 }
373 print!("{:02x} ", byte);
374 }
375 println!("\n... ({} total bytes)", binary_content.len());
376
377 // Load and verify both formats produce identical results
378 let json_loaded = UserProfile::load_json("temp_format_comparison.json")?;
379 let binary_loaded = UserProfile::load_binary("temp_format_comparison.bin")?;
380
381 assert_eq!(json_loaded, binary_loaded);
382 println!("\nFormat consistency verification: PASSED");
383
384 Ok(())
385}

345fn demonstrate_common_error_scenarios() -> Result<(), Box<dyn std::error::Error>> {
346 println!("--- Common Error Scenarios ---");
347
348 fs::create_dir_all("temp_error_tests")?;
349
350 // Scenario 1: Corrupted JSON file
351 println!("1. Corrupted JSON File:");
352 let corrupted_json = r#"{"name": "test", "value": 42, "incomplete"#;
353 fs::write("temp_error_tests/corrupted.json", corrupted_json)?;
354
355 match VersionedData::load_json("temp_error_tests/corrupted.json") {
356 Ok(_) => println!(" Unexpected: Corrupted JSON was parsed successfully"),
357 Err(e) => println!(" Expected error: {}", e),
358 }
359
360 // Scenario 2: Missing required fields
361 println!("\n2. Missing Required Fields:");
362 let incomplete_json = r#"{"name": "test"}"#;
363 fs::write("temp_error_tests/incomplete.json", incomplete_json)?;
364
365 match VersionedData::load_json("temp_error_tests/incomplete.json") {
366 Ok(_) => println!(" Unexpected: Incomplete JSON was parsed successfully"),
367 Err(e) => println!(" Expected error: {}", e),
368 }
369
370 // Scenario 3: Type mismatches
371 println!("\n3. Type Mismatch:");
372 let type_mismatch_json = r#"{"version": "not_a_number", "name": "test", "value": 42.0}"#;
373 fs::write("temp_error_tests/type_mismatch.json", type_mismatch_json)?;
374
375 match VersionedData::load_json("temp_error_tests/type_mismatch.json") {
376 Ok(_) => println!(" Unexpected: Type mismatch was handled gracefully"),
377 Err(e) => println!(" Expected error: {}", e),
378 }
379
380 // Scenario 4: File not found
381 println!("\n4. File Not Found:");
382 match VersionedData::load_json("temp_error_tests/nonexistent.json") {
383 Ok(_) => println!(" Unexpected: Non-existent file was loaded"),
384 Err(e) => println!(" Expected error: {}", e),
385 }
386
387 // Scenario 5: Binary format mismatch
388 println!("\n5. Binary Format Mismatch:");
389 let invalid_binary = vec![0xFF, 0xFF, 0xFF, 0xFF]; // Invalid binary data
390 fs::write("temp_error_tests/invalid.bin", invalid_binary)?;
391
392 match VersionedData::load_binary("temp_error_tests/invalid.bin") {
393 Ok(_) => println!(" Unexpected: Invalid binary was parsed successfully"),
394 Err(e) => println!(" Expected error: {}", e),
395 }
396
397 // Scenario 6: Wrong format loading
398 println!("\n6. Wrong Format Loading:");
399 let valid_data = VersionedData {
400 version: 1,
401 name: "test".to_string(),
402 value: 42.0,
403 optional_field: None,
404 new_field: None,
405 };
406 valid_data.save_binary("temp_error_tests/valid.bin")?;
407
408 // Try to load binary file as JSON
409 match VersionedData::load_json("temp_error_tests/valid.bin") {
410 Ok(_) => println!(" Unexpected: Binary file was loaded as JSON"),
411 Err(e) => println!(" Expected error: {}", e),
412 }
413
414 Ok(())
415}

Source
fn to_json(&self) -> SerializationResult<String>
Converts the struct to a JSON string
Serializes the struct to a human-readable JSON string format. The JSON output maintains field order and includes proper escaping.
§Returns
Ok(String) containing the JSON representation on success
Err(SerializationError) if serialization fails
§Examples
Converts struct to JSON string with proper escaping and formatting.
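A minimal, self-contained sketch of the field-by-field JSON building and string escaping this method implies; the `User` struct and `escape_json` helper are hypothetical (the trait's provided `to_json` builds on `to_serializer` instead):

```rust
// Hypothetical stand-in for any StructSerializable type.
struct User {
    name: String,
    age: u32,
}

// Minimal escaping for string values: quotes, backslashes, and
// newlines must not appear raw inside a JSON string.
fn escape_json(s: &str) -> String {
    let mut out = String::with_capacity(s.len());
    for c in s.chars() {
        match c {
            '"' => out.push_str("\\\""),
            '\\' => out.push_str("\\\\"),
            '\n' => out.push_str("\\n"),
            c => out.push(c),
        }
    }
    out
}

impl User {
    // Fields are emitted in declaration order, mirroring the trait's
    // field-order guarantee.
    fn to_json(&self) -> String {
        format!(
            "{{\"name\":\"{}\",\"age\":{}}}",
            escape_json(&self.name),
            self.age
        )
    }
}

fn main() {
    let user = User {
        name: "Ada \"the first\"".to_string(),
        age: 36,
    };
    println!("{}", user.to_json());
}
```

The embedded quote in the name survives the roundtrip because it is emitted as `\"`, which is what "proper escaping" above refers to.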
Examples found in repository
85 fn to_field_value(&self) -> FieldValue {
86 match self.to_json() {
87 Ok(json_str) => FieldValue::from_json_object(json_str),
88 Err(_) => FieldValue::from_string("serialization_error".to_string()),
89 }
90 }
91}
92
93impl FromFieldValue for VersionedData {
94 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
95 // Try JSON object first
96 if let Ok(json_data) = value.as_json_object() {
97 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
98 field: field_name.to_string(),
99 message: format!("Failed to deserialize VersionedData from JSON: {}", e),
100 });
101 }
102
103 // Try binary object
104 if let Ok(binary_data) = value.as_binary_object() {
105 return Self::from_binary(binary_data).map_err(|e| {
106 SerializationError::ValidationFailed {
107 field: field_name.to_string(),
108 message: format!("Failed to deserialize VersionedData from binary: {}", e),
109 }
110 });
111 }
112
113 Err(SerializationError::ValidationFailed {
114 field: field_name.to_string(),
115 message: format!(
116 "Expected JsonObject or BinaryObject for VersionedData, found {}",
117 value.type_name()
118 ),
119 })
120 }
121}
122
123/// Validated user input with constraints
124#[derive(Debug, Clone, PartialEq)]
125pub struct ValidatedUserInput {
126 pub username: String,
127 pub email: String,
128 pub age: u16,
129 pub preferences: HashMap<String, String>,
130}
131
132impl StructSerializable for ValidatedUserInput {
133 fn to_serializer(&self) -> StructSerializer {
134 StructSerializer::new()
135 .field("username", &self.username)
136 .field("email", &self.email)
137 .field("age", &self.age)
138 .field("preferences", &self.preferences)
139 }
140
141 fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
142 let username: String = deserializer.field("username")?;
143 let email: String = deserializer.field("email")?;
144 let age: u16 = deserializer.field("age")?;
145 let preferences: HashMap<String, String> = deserializer.field("preferences")?;
146
147 // Validate username
148 if username.is_empty() || username.len() > 50 {
149 return Err(SerializationError::ValidationFailed {
150 field: "username".to_string(),
151 message: "Username must be 1-50 characters long".to_string(),
152 });
153 }
154
155 if !username
156 .chars()
157 .all(|c| c.is_alphanumeric() || c == '_' || c == '-')
158 {
159 return Err(SerializationError::ValidationFailed {
160 field: "username".to_string(),
161 message:
162 "Username can only contain alphanumeric characters, underscores, and hyphens"
163 .to_string(),
164 });
165 }
166
167 // Validate email (basic check)
168 if !email.contains('@') || !email.contains('.') || email.len() < 5 {
169 return Err(SerializationError::ValidationFailed {
170 field: "email".to_string(),
171 message: "Invalid email format".to_string(),
172 });
173 }
174
175 // Validate age
176 if !(13..=120).contains(&age) {
177 return Err(SerializationError::ValidationFailed {
178 field: "age".to_string(),
179 message: "Age must be between 13 and 120".to_string(),
180 });
181 }
182
183 // Validate preferences
184 if preferences.len() > 20 {
185 return Err(SerializationError::ValidationFailed {
186 field: "preferences".to_string(),
187 message: "Too many preferences (maximum 20)".to_string(),
188 });
189 }
190
191 for (key, value) in &preferences {
192 if key.len() > 50 || value.len() > 200 {
193 return Err(SerializationError::ValidationFailed {
194 field: "preferences".to_string(),
195 message: format!("Preference key/value too long: {}", key),
196 });
197 }
198 }
199
200 Ok(ValidatedUserInput {
201 username,
202 email,
203 age,
204 preferences,
205 })
206 }
207}
208
209impl ToFieldValue for ValidatedUserInput {
210 fn to_field_value(&self) -> FieldValue {
211 match self.to_json() {
212 Ok(json_str) => FieldValue::from_json_object(json_str),
213 Err(_) => FieldValue::from_string("serialization_error".to_string()),
214 }
215 }
216}
217
218impl FromFieldValue for ValidatedUserInput {
219 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
220 // Try JSON object first
221 if let Ok(json_data) = value.as_json_object() {
222 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
223 field: field_name.to_string(),
224 message: format!("Failed to deserialize ValidatedUserInput from JSON: {}", e),
225 });
226 }
227
228 // Try binary object
229 if let Ok(binary_data) = value.as_binary_object() {
230 return Self::from_binary(binary_data).map_err(|e| {
231 SerializationError::ValidationFailed {
232 field: field_name.to_string(),
233 message: format!(
234 "Failed to deserialize ValidatedUserInput from binary: {}",
235 e
236 ),
237 }
238 });
239 }
240
241 Err(SerializationError::ValidationFailed {
242 field: field_name.to_string(),
243 message: format!(
244 "Expected JsonObject or BinaryObject for ValidatedUserInput, found {}",
245 value.type_name()
246 ),
247 })
248 }
249}
250
251/// Recovery helper for handling partial data
252#[derive(Debug, Clone, PartialEq)]
253pub struct RecoverableData {
254 pub critical_field: String,
255 pub important_field: Option<String>,
256 pub optional_field: Option<String>,
257 pub metadata: HashMap<String, String>,
258}
259
260impl StructSerializable for RecoverableData {
261 fn to_serializer(&self) -> StructSerializer {
262 StructSerializer::new()
263 .field("critical_field", &self.critical_field)
264 .field("important_field", &self.important_field)
265 .field("optional_field", &self.optional_field)
266 .field("metadata", &self.metadata)
267 }
268
269 fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
270 // Critical field - must exist
271 let critical_field = deserializer.field("critical_field")?;
272
273 // Important field - try to recover if missing
274 let important_field = deserializer.field_optional("important_field")?;
275
276 // Optional field - graceful fallback
277 let optional_field = deserializer.field_optional("optional_field")?;
278
279 // Metadata - recover what we can
280 let metadata = deserializer.field_or("metadata", HashMap::new())?;
281
282 Ok(RecoverableData {
283 critical_field,
284 important_field,
285 optional_field,
286 metadata,
287 })
288 }
289}
290
291impl ToFieldValue for RecoverableData {
292 fn to_field_value(&self) -> FieldValue {
293 match self.to_json() {
294 Ok(json_str) => FieldValue::from_json_object(json_str),
295 Err(_) => FieldValue::from_string("serialization_error".to_string()),
296 }
297 }
298}
299
300impl FromFieldValue for RecoverableData {
301 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
302 // Try JSON object first
303 if let Ok(json_data) = value.as_json_object() {
304 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
305 field: field_name.to_string(),
306 message: format!("Failed to deserialize RecoverableData from JSON: {}", e),
307 });
308 }
309
310 // Try binary object
311 if let Ok(binary_data) = value.as_binary_object() {
312 return Self::from_binary(binary_data).map_err(|e| {
313 SerializationError::ValidationFailed {
314 field: field_name.to_string(),
315 message: format!("Failed to deserialize RecoverableData from binary: {}", e),
316 }
317 });
318 }
319
320 Err(SerializationError::ValidationFailed {
321 field: field_name.to_string(),
322 message: format!(
323 "Expected JsonObject or BinaryObject for RecoverableData, found {}",
324 value.type_name()
325 ),
326 })
327 }
328}
329
330fn main() -> Result<(), Box<dyn std::error::Error>> {
331 println!("=== Error Handling and Validation Example ===\n");
332
333 demonstrate_common_error_scenarios()?;
334 demonstrate_validation_patterns()?;
335 demonstrate_schema_evolution()?;
336 demonstrate_recovery_strategies()?;
337 demonstrate_production_error_handling()?;
338 cleanup_temp_files()?;
339
340 println!("\n=== Example completed successfully! ===");
341 Ok(())
342}
343
344/// Demonstrate common serialization error scenarios
345fn demonstrate_common_error_scenarios() -> Result<(), Box<dyn std::error::Error>> {
346 println!("--- Common Error Scenarios ---");
347
348 fs::create_dir_all("temp_error_tests")?;
349
350 // Scenario 1: Corrupted JSON file
351 println!("1. Corrupted JSON File:");
352 let corrupted_json = r#"{"name": "test", "value": 42, "incomplete"#;
353 fs::write("temp_error_tests/corrupted.json", corrupted_json)?;
354
355 match VersionedData::load_json("temp_error_tests/corrupted.json") {
356 Ok(_) => println!(" Unexpected: Corrupted JSON was parsed successfully"),
357 Err(e) => println!(" Expected error: {}", e),
358 }
359
360 // Scenario 2: Missing required fields
361 println!("\n2. Missing Required Fields:");
362 let incomplete_json = r#"{"name": "test"}"#;
363 fs::write("temp_error_tests/incomplete.json", incomplete_json)?;
364
365 match VersionedData::load_json("temp_error_tests/incomplete.json") {
366 Ok(_) => println!(" Unexpected: Incomplete JSON was parsed successfully"),
367 Err(e) => println!(" Expected error: {}", e),
368 }
369
370 // Scenario 3: Type mismatches
371 println!("\n3. Type Mismatch:");
372 let type_mismatch_json = r#"{"version": "not_a_number", "name": "test", "value": 42.0}"#;
373 fs::write("temp_error_tests/type_mismatch.json", type_mismatch_json)?;
374
375 match VersionedData::load_json("temp_error_tests/type_mismatch.json") {
376 Ok(_) => println!(" Unexpected: Type mismatch was handled gracefully"),
377 Err(e) => println!(" Expected error: {}", e),
378 }
379
380 // Scenario 4: File not found
381 println!("\n4. File Not Found:");
382 match VersionedData::load_json("temp_error_tests/nonexistent.json") {
383 Ok(_) => println!(" Unexpected: Non-existent file was loaded"),
384 Err(e) => println!(" Expected error: {}", e),
385 }
386
387 // Scenario 5: Binary format mismatch
388 println!("\n5. Binary Format Mismatch:");
389 let invalid_binary = vec![0xFF, 0xFF, 0xFF, 0xFF]; // Invalid binary data
390 fs::write("temp_error_tests/invalid.bin", invalid_binary)?;
391
392 match VersionedData::load_binary("temp_error_tests/invalid.bin") {
393 Ok(_) => println!(" Unexpected: Invalid binary was parsed successfully"),
394 Err(e) => println!(" Expected error: {}", e),
395 }
396
397 // Scenario 6: Wrong format loading
398 println!("\n6. Wrong Format Loading:");
399 let valid_data = VersionedData {
400 version: 1,
401 name: "test".to_string(),
402 value: 42.0,
403 optional_field: None,
404 new_field: None,
405 };
406 valid_data.save_binary("temp_error_tests/valid.bin")?;
407
408 // Try to load binary file as JSON
409 match VersionedData::load_json("temp_error_tests/valid.bin") {
410 Ok(_) => println!(" Unexpected: Binary file was loaded as JSON"),
411 Err(e) => println!(" Expected error: {}", e),
412 }
413
414 Ok(())
415}
416
417/// Demonstrate validation patterns
418fn demonstrate_validation_patterns() -> Result<(), Box<dyn std::error::Error>> {
419 println!("\n--- Validation Patterns ---");
420
421 println!("Testing input validation with various scenarios:");
422
423 // Valid input
424 println!("\n1. Valid Input:");
425 let mut valid_preferences = HashMap::new();
426 valid_preferences.insert("theme".to_string(), "dark".to_string());
427 valid_preferences.insert("language".to_string(), "en".to_string());
428
429 let valid_input = ValidatedUserInput {
430 username: "john_doe".to_string(),
431 email: "john@example.com".to_string(),
432 age: 25,
433 preferences: valid_preferences,
434 };
435
436 match valid_input.to_json() {
437 Ok(json) => {
438 println!(" ✓ Valid input serialized successfully");
439 match ValidatedUserInput::from_json(&json) {
440 Ok(_) => println!(" ✓ Valid input deserialized successfully"),
441 Err(e) => println!(" ✗ Deserialization failed: {}", e),
442 }
443 }
444 Err(e) => println!(" ✗ Serialization failed: {}", e),
445 }
446
447 // Test validation errors
448 let validation_tests = vec![
449 (
450 "Empty username",
451 ValidatedUserInput {
452 username: "".to_string(),
453 email: "test@example.com".to_string(),
454 age: 25,
455 preferences: HashMap::new(),
456 },
457 ),
458 (
459 "Invalid username characters",
460 ValidatedUserInput {
461 username: "user@name!".to_string(),
462 email: "test@example.com".to_string(),
463 age: 25,
464 preferences: HashMap::new(),
465 },
466 ),
467 (
468 "Invalid email",
469 ValidatedUserInput {
470 username: "username".to_string(),
471 email: "invalid_email".to_string(),
472 age: 25,
473 preferences: HashMap::new(),
474 },
475 ),
476 (
477 "Age too low",
478 ValidatedUserInput {
479 username: "username".to_string(),
480 email: "test@example.com".to_string(),
481 age: 10,
482 preferences: HashMap::new(),
483 },
484 ),
485 (
486 "Age too high",
487 ValidatedUserInput {
488 username: "username".to_string(),
489 email: "test@example.com".to_string(),
490 age: 150,
491 preferences: HashMap::new(),
492 },
493 ),
494 ];
495
496 for (description, invalid_input) in validation_tests {
497 println!("\n2. {}:", description);
498 match invalid_input.to_json() {
499 Ok(json) => match ValidatedUserInput::from_json(&json) {
500 Ok(_) => println!(" ✗ Unexpected: Invalid input was accepted"),
501 Err(e) => println!(" ✓ Expected validation error: {}", e),
502 },
503 Err(e) => println!(" ✗ Serialization error: {}", e),
504 }
505 }
506
507 // Test preferences validation
508 println!("\n3. Preferences Validation:");
509 let mut too_many_preferences = HashMap::new();
510 for i in 0..25 {
511 too_many_preferences.insert(format!("pref_{}", i), "value".to_string());
512 }
513
514 let invalid_prefs_input = ValidatedUserInput {
515 username: "username".to_string(),
516 email: "test@example.com".to_string(),
517 age: 25,
518 preferences: too_many_preferences,
519 };
520
521 match invalid_prefs_input.to_json() {
522 Ok(json) => match ValidatedUserInput::from_json(&json) {
523 Ok(_) => println!(" ✗ Unexpected: Too many preferences were accepted"),
524 Err(e) => println!(" ✓ Expected validation error: {}", e),
525 },
526 Err(e) => println!(" ✗ Serialization error: {}", e),
527 }
528
529 Ok(())
530}
531
/// Demonstrate schema evolution patterns
fn demonstrate_schema_evolution() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Schema Evolution Patterns ---");

    fs::create_dir_all("temp_schema_tests")?;

    // Create data with different schema versions
    println!("Creating data with different schema versions:");

    // Version 1 data (minimal)
    let v1_json = r#"{
        "version": 1,
        "name": "legacy_data",
        "value": 123.45
    }"#;
    fs::write("temp_schema_tests/v1_data.json", v1_json)?;
    println!("  ✓ Version 1 data created (minimal fields)");

    // Version 2 data (with optional field)
    let v2_json = r#"{
        "version": 2,
        "name": "v2_data",
        "value": 678.90,
        "optional_field": "added_in_v2"
    }"#;
    fs::write("temp_schema_tests/v2_data.json", v2_json)?;
    println!("  ✓ Version 2 data created (with optional field)");

    // Version 3 data (with all fields)
    let v3_data = VersionedData {
        version: 3,
        name: "v3_data".to_string(),
        value: 999.99,
        optional_field: Some("present".to_string()),
        new_field: Some(42),
    };
    v3_data.save_json("temp_schema_tests/v3_data.json")?;
    println!("  ✓ Version 3 data created (all fields)");

    // Test backward compatibility
    println!("\nTesting backward compatibility:");

    // Load v1 data with current deserializer
    match VersionedData::load_json("temp_schema_tests/v1_data.json") {
        Ok(data) => {
            println!("  ✓ V1 data loaded successfully:");
            println!("    Name: {}", data.name);
            println!("    Value: {}", data.value);
            println!("    Optional field: {:?}", data.optional_field);
            println!("    New field: {:?}", data.new_field);
        }
        Err(e) => println!("  ✗ Failed to load V1 data: {}", e),
    }

    // Load v2 data with current deserializer
    match VersionedData::load_json("temp_schema_tests/v2_data.json") {
        Ok(data) => {
            println!("  ✓ V2 data loaded successfully:");
            println!("    Name: {}", data.name);
            println!("    Value: {}", data.value);
            println!("    Optional field: {:?}", data.optional_field);
            println!("    New field: {:?}", data.new_field);
        }
        Err(e) => println!("  ✗ Failed to load V2 data: {}", e),
    }

    // Test future version rejection
    println!("\nTesting future version handling:");
    let future_version_json = r#"{
        "version": 99,
        "name": "future_data",
        "value": 123.45,
        "unknown_field": "should_be_ignored"
    }"#;
    fs::write("temp_schema_tests/future_data.json", future_version_json)?;

    match VersionedData::load_json("temp_schema_tests/future_data.json") {
        Ok(_) => println!("  ✗ Unexpected: Future version was accepted"),
        Err(e) => println!("  ✓ Expected rejection of future version: {}", e),
    }

    // Demonstrate migration strategy
    println!("\nDemonstrating migration strategy:");
    println!("  Strategy: Load old format, upgrade to new format, save");

    // Simulate migrating v1 data to v3 format
    let v1_loaded = VersionedData::load_json("temp_schema_tests/v1_data.json")?;
    let v1_upgraded = VersionedData {
        version: 3,
        name: v1_loaded.name,
        value: v1_loaded.value,
        optional_field: Some("migrated_default".to_string()),
        new_field: Some(0),
    };

    v1_upgraded.save_json("temp_schema_tests/v1_migrated.json")?;
    println!("  ✓ V1 data migrated to V3 format");

    Ok(())
}

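The single-step migration above jumps straight from V1 to V3. For more versions, this generalizes to a chain of per-version upgrade functions applied until the target version is reached. A minimal std-only sketch under that assumption — the `Record` type and the default values are hypothetical, standing in for `VersionedData`:

```rust
/// Hypothetical versioned record used only for this sketch.
#[derive(Debug, Clone, PartialEq)]
struct Record {
    version: u32,
    name: String,
    optional_field: Option<String>,
}

/// Upgrade one version at a time until the record reaches `target`.
fn migrate(mut record: Record, target: u32) -> Result<Record, String> {
    while record.version < target {
        record = match record.version {
            // V1 -> V2: the optional field gains a migration default.
            1 => Record {
                version: 2,
                optional_field: Some("migrated_default".to_string()),
                ..record
            },
            // V2 -> V3: no data changes in this sketch, only the version bump.
            2 => Record { version: 3, ..record },
            v => return Err(format!("no migration path from version {}", v)),
        };
    }
    Ok(record)
}

fn main() {
    let v1 = Record {
        version: 1,
        name: "legacy".to_string(),
        optional_field: None,
    };
    let v3 = migrate(v1, 3).expect("migration failed");
    println!("{:?}", v3);
}
```

Chaining one-version steps keeps each upgrade small and testable, and unknown versions fail loudly instead of producing half-migrated data.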
/// Demonstrate recovery strategies
fn demonstrate_recovery_strategies() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Recovery Strategies ---");

    fs::create_dir_all("temp_recovery_tests")?;

    // Strategy 1: Graceful degradation
    println!("1. Graceful Degradation Strategy:");

    // Create complete data
    let complete_data = RecoverableData {
        critical_field: "essential_info".to_string(),
        important_field: Some("important_info".to_string()),
        optional_field: Some("nice_to_have".to_string()),
        metadata: {
            let mut map = HashMap::new();
            map.insert("key1".to_string(), "value1".to_string());
            map.insert("key2".to_string(), "value2".to_string());
            map
        },
    };

    // Save complete data
    complete_data.save_json("temp_recovery_tests/complete.json")?;

    // Create partial data (missing some fields)
    let partial_json = r#"{
        "critical_field": "essential_info",
        "optional_field": "nice_to_have"
    }"#;
    fs::write("temp_recovery_tests/partial.json", partial_json)?;

    // Load partial data and demonstrate recovery
    match RecoverableData::load_json("temp_recovery_tests/partial.json") {
        Ok(recovered) => {
            println!("  ✓ Partial data recovered successfully:");
            println!("    Critical field: {}", recovered.critical_field);
            println!(
                "    Important field: {:?} (missing, set to None)",
                recovered.important_field
            );
            println!("    Optional field: {:?}", recovered.optional_field);
            println!(
                "    Metadata: {} entries (defaulted to empty)",
                recovered.metadata.len()
            );
        }
        Err(e) => println!("  ✗ Recovery failed: {}", e),
    }

    // Strategy 2: Error context preservation
    println!("\n2. Error Context Preservation:");

    let malformed_json = r#"{
        "critical_field": "essential_info",
        "important_field": 12345,
        "metadata": "not_a_map"
    }"#;
    fs::write("temp_recovery_tests/malformed.json", malformed_json)?;

    match RecoverableData::load_json("temp_recovery_tests/malformed.json") {
        Ok(_) => println!("  ✗ Unexpected: Malformed data was accepted"),
        Err(e) => {
            println!("  ✓ Error context preserved:");
            println!("    Error: {}", e);
            println!("    Error type: {:?}", std::mem::discriminant(&e));
        }
    }

    // Strategy 3: Fallback data sources
    println!("\n3. Fallback Data Sources:");

    // Primary source (corrupted)
    let corrupted_primary = "corrupted data";
    fs::write("temp_recovery_tests/primary.json", corrupted_primary)?;

    // Backup source (valid)
    let backup_data = RecoverableData {
        critical_field: "backup_critical".to_string(),
        important_field: Some("backup_important".to_string()),
        optional_field: None,
        metadata: HashMap::new(),
    };
    backup_data.save_json("temp_recovery_tests/backup.json")?;

    // Default fallback
    let default_data = RecoverableData {
        critical_field: "default_critical".to_string(),
        important_field: None,
        optional_field: None,
        metadata: HashMap::new(),
    };

    println!("  Attempting to load data with fallback chain:");

    // Try primary source
    let loaded_data = match RecoverableData::load_json("temp_recovery_tests/primary.json") {
        Ok(data) => {
            println!("  ✓ Loaded from primary source");
            data
        }
        Err(_) => {
            println!("  ✗ Primary source failed, trying backup");

            // Try backup source
            match RecoverableData::load_json("temp_recovery_tests/backup.json") {
                Ok(data) => {
                    println!("  ✓ Loaded from backup source");
                    data
                }
                Err(_) => {
                    println!("  ✗ Backup source failed, using default");
                    default_data
                }
            }
        }
    };

    println!("  Final loaded data:");
    println!("    Critical field: {}", loaded_data.critical_field);

    Ok(())
}

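The nested match in the fallback chain above grows one indentation level per source. The same pattern can be factored into a reusable helper that walks an ordered slice of loaders; a minimal std-only sketch (the helper name and closure-based API are assumptions, not part of the library):

```rust
/// Try each loader in order, returning the first success or the default.
/// Loaders are erased to `&dyn Fn` so different closure types can share a slice.
fn load_with_fallback<T>(sources: &[&dyn Fn() -> Result<T, String>], default: T) -> T {
    for load in sources {
        if let Ok(value) = load() {
            return value;
        }
    }
    default
}

fn main() {
    // Stand-ins for the primary (corrupted) and backup (valid) sources above.
    let primary = || Err::<i32, String>("corrupted".to_string());
    let backup = || Ok::<i32, String>(42);

    let value = load_with_fallback(
        &[&primary as &dyn Fn() -> Result<i32, String>, &backup],
        0, // default fallback
    );
    println!("{}", value); // falls through to the backup source
}
```

This keeps the priority order in one place, so adding a fourth source is a one-line change instead of another nesting level.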
/// Demonstrate production-ready error handling
fn demonstrate_production_error_handling() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Production Error Handling ---");

    fs::create_dir_all("temp_production_tests")?;

    // Error logging and monitoring
    println!("1. Error Logging and Monitoring:");

    let test_data = VersionedData {
        version: 2,
        name: "production_test".to_string(),
        value: 42.0,
        optional_field: Some("test".to_string()),
        new_field: None,
    };

    // Create error log for demonstration
    let mut error_log = fs::OpenOptions::new()
        .create(true)
        .append(true)
        .open("temp_production_tests/error.log")?;

    // Closure to log errors in production format
    let mut log_error =
        |error: &SerializationError, context: &str| -> Result<(), Box<dyn std::error::Error>> {
            let timestamp = std::time::SystemTime::now()
                .duration_since(std::time::UNIX_EPOCH)?
                .as_secs();

            writeln!(error_log, "[{}] ERROR in {}: {}", timestamp, context, error)?;
            Ok(())
        };

    // Simulate various error scenarios with logging
    let error_scenarios = vec![
        ("corrupted_file.json", "invalid json content"),
        ("missing_fields.json", r#"{"version": 1}"#),
        (
            "type_error.json",
            r#"{"version": "not_number", "name": "test", "value": 42.0}"#,
        ),
    ];

    for (filename, content) in error_scenarios {
        let filepath = format!("temp_production_tests/{}", filename);
        fs::write(&filepath, content)?;

        match VersionedData::load_json(&filepath) {
            Ok(_) => println!("  ✗ Unexpected success for {}", filename),
            Err(e) => {
                log_error(&e, &format!("load_config({})", filename))?;
                println!("  ✓ Error logged for {}: {}", filename, e);
            }
        }
    }

    // Health check pattern
    println!("\n2. Health Check Pattern:");

    let health_check = || -> Result<bool, SerializationError> {
        // Check if we can serialize/deserialize basic data
        let test_data = VersionedData {
            version: 1,
            name: "health_check".to_string(),
            value: 1.0,
            optional_field: None,
            new_field: None,
        };

        let serialized = test_data.to_json()?;
        let _deserialized = VersionedData::from_json(&serialized)?;
        Ok(true)
    };

    match health_check() {
        Ok(_) => println!("  ✓ Serialization system health check passed"),
        Err(e) => {
            log_error(&e, "health_check")?;
            println!("  ✗ Serialization system health check failed: {}", e);
        }
    }

    // Circuit breaker pattern simulation
    println!("\n3. Circuit Breaker Pattern:");

    struct CircuitBreaker {
        failure_count: u32,
        failure_threshold: u32,
        is_open: bool,
    }

    impl CircuitBreaker {
        fn new(threshold: u32) -> Self {
            Self {
                failure_count: 0,
                failure_threshold: threshold,
                is_open: false,
            }
        }

        fn call<F, T>(&mut self, operation: F) -> Result<T, String>
        where
            F: FnOnce() -> Result<T, SerializationError>,
        {
            if self.is_open {
                return Err("Circuit breaker is open".to_string());
            }

            match operation() {
                Ok(result) => {
                    self.failure_count = 0; // Reset on success
                    Ok(result)
                }
                Err(e) => {
                    self.failure_count += 1;
                    if self.failure_count >= self.failure_threshold {
                        self.is_open = true;
                        println!(
                            "    Circuit breaker opened after {} failures",
                            self.failure_count
                        );
                    }
                    Err(e.to_string())
                }
            }
        }
    }

    let mut circuit_breaker = CircuitBreaker::new(3);

    // Simulate operations that fail
    for i in 1..=5 {
        let result = circuit_breaker
            .call(|| VersionedData::load_json("temp_production_tests/corrupted_file.json"));

        match result {
            Ok(_) => println!("  Operation {} succeeded", i),
            Err(e) => println!("  Operation {} failed: {}", i, e),
        }
    }

    // Retry mechanism
    println!("\n4. Retry Mechanism:");

    let retry_operation = |max_attempts: u32| -> Result<VersionedData, String> {
        for attempt in 1..=max_attempts {
            println!("  Attempt {}/{}", attempt, max_attempts);

            // Try different sources in order
            let sources = vec![
                "temp_production_tests/corrupted_file.json",
                "temp_production_tests/missing_fields.json",
                "temp_production_tests/backup_valid.json",
            ];

            if attempt == max_attempts {
                // On final attempt, create valid backup
                test_data
                    .save_json("temp_production_tests/backup_valid.json")
                    .map_err(|e| format!("Failed to create backup: {}", e))?;
            }

            for source in &sources {
                match VersionedData::load_json(source) {
                    Ok(data) => {
                        println!("    ✓ Succeeded loading from {}", source);
                        return Ok(data);
                    }
                    Err(_) => {
                        println!("    ✗ Failed to load from {}", source);
                        continue;
                    }
                }
            }

            if attempt < max_attempts {
                println!("    Waiting before retry...");
                // In real code, would sleep here
            }
        }

        Err("All retry attempts exhausted".to_string())
    };

    match retry_operation(3) {
        Ok(data) => println!("  ✓ Retry succeeded: {}", data.name),
        Err(e) => println!("  ✗ Retry failed: {}", e),
    }

    Ok(())
}

More examples
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for PerformanceMetrics {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize PerformanceMetrics from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!(
                        "Failed to deserialize PerformanceMetrics from binary: {}",
                        e
                    ),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for PerformanceMetrics, found {}",
                value.type_name()
            ),
        })
    }
}

/// Large dataset for performance testing
#[derive(Debug, Clone, PartialEq)]
pub struct LargeDataset {
    pub name: String,
    pub values: Vec<f32>, // Changed from f64 to f32 (supported)
    pub labels: Vec<String>,
    pub feature_count: usize,     // Simplified from Vec<Vec<f32>> to just a count
    pub feature_dimension: usize, // Store dimensions separately
    pub metadata: HashMap<String, String>,
    pub timestamp_count: usize, // Simplified from Vec<u64> to just a count
}

impl StructSerializable for LargeDataset {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("name", &self.name)
            .field("values", &self.values)
            .field("labels", &self.labels)
            .field("feature_count", &self.feature_count)
            .field("feature_dimension", &self.feature_dimension)
            .field("metadata", &self.metadata)
            .field("timestamp_count", &self.timestamp_count)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        let name = deserializer.field("name")?;
        let values = deserializer.field("values")?;
        let labels = deserializer.field("labels")?;
        let feature_count = deserializer.field("feature_count")?;
        let feature_dimension = deserializer.field("feature_dimension")?;
        let metadata = deserializer.field("metadata")?;
        let timestamp_count = deserializer.field("timestamp_count")?;

        Ok(LargeDataset {
            name,
            values,
            labels,
            feature_count,
            feature_dimension,
            metadata,
            timestamp_count,
        })
    }
}

impl ToFieldValue for LargeDataset {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for LargeDataset {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize LargeDataset from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize LargeDataset from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for LargeDataset, found {}",
                value.type_name()
            ),
        })
    }
}

/// Configuration data (typical JSON use case)
#[derive(Debug, Clone, PartialEq)]
pub struct Configuration {
    pub version: String,
    pub debug_enabled: bool,
    pub log_level: String,
    pub database_settings: HashMap<String, String>,
    pub feature_flags_enabled: bool, // Simplified from HashMap<String, bool>
    pub max_connections: f32,        // Simplified from HashMap<String, f64>
    pub timeout_seconds: f32,
}

impl StructSerializable for Configuration {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("version", &self.version)
            .field("debug_enabled", &self.debug_enabled)
            .field("log_level", &self.log_level)
            .field("database_settings", &self.database_settings)
            .field("feature_flags_enabled", &self.feature_flags_enabled)
            .field("max_connections", &self.max_connections)
            .field("timeout_seconds", &self.timeout_seconds)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        let version = deserializer.field("version")?;
        let debug_enabled = deserializer.field("debug_enabled")?;
        let log_level = deserializer.field("log_level")?;
        let database_settings = deserializer.field("database_settings")?;
        let feature_flags_enabled = deserializer.field("feature_flags_enabled")?;
        let max_connections = deserializer.field("max_connections")?;
        let timeout_seconds = deserializer.field("timeout_seconds")?;

        Ok(Configuration {
            version,
            debug_enabled,
            log_level,
            database_settings,
            feature_flags_enabled,
            max_connections,
            timeout_seconds,
        })
    }
}

impl ToFieldValue for Configuration {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for Configuration {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize Configuration from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize Configuration from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for Configuration, found {}",
                value.type_name()
            ),
        })
    }
}

/// Format comparison results
#[derive(Debug)]
pub struct FormatComparison {
    pub data_type: String,
    pub json_size_bytes: u64,
    pub binary_size_bytes: u64,
    pub json_serialize_micros: u64,
    pub binary_serialize_micros: u64,
    pub json_deserialize_micros: u64,
    pub binary_deserialize_micros: u64,
    pub size_ratio: f64,
    pub serialize_speed_ratio: f64,
    pub deserialize_speed_ratio: f64,
}

impl FormatComparison {
    fn new(data_type: String) -> Self {
        Self {
            data_type,
            json_size_bytes: 0,
            binary_size_bytes: 0,
            json_serialize_micros: 0,
            binary_serialize_micros: 0,
            json_deserialize_micros: 0,
            binary_deserialize_micros: 0,
            size_ratio: 0.0,
            serialize_speed_ratio: 0.0,
            deserialize_speed_ratio: 0.0,
        }
    }

    fn calculate_ratios(&mut self) {
        // Clamp denominators to 1 so zero-duration measurements cannot divide by zero.
        self.size_ratio = self.json_size_bytes as f64 / self.binary_size_bytes.max(1) as f64;
        // Speed ratios express how many times faster binary is than JSON,
        // i.e. JSON time divided by binary time (> 1.0 means binary is faster).
        self.serialize_speed_ratio =
            self.json_serialize_micros as f64 / self.binary_serialize_micros.max(1) as f64;
        self.deserialize_speed_ratio =
            self.json_deserialize_micros as f64 / self.binary_deserialize_micros.max(1) as f64;
    }
}
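The benchmarks below time each operation inline with `Instant::now()`. That boilerplate can be collapsed into a small helper, which also guards against the zero-microsecond readings that make ratio calculations divide by zero. A std-only sketch (the helper name is an assumption, not part of the library):

```rust
use std::time::Instant;

/// Time a closure, returning its result and the elapsed microseconds.
/// Elapsed time is clamped to at least 1µs so ratio math never divides by zero.
fn time_micros<T>(f: impl FnOnce() -> T) -> (T, u64) {
    let start = Instant::now();
    let result = f();
    let micros = start.elapsed().as_micros().max(1) as u64;
    (result, micros)
}

fn main() {
    // Any closure works; here a cheap CPU-bound stand-in for serialization.
    let (sum, micros) = time_micros(|| (0u64..1_000_000).sum::<u64>());
    println!("sum = {} in {}µs", sum, micros);
}
```

With this helper, each benchmark step becomes a one-liner such as `let (json_data, t) = time_micros(|| config.to_json());`, keeping the measurement logic in one place.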

fn main() -> Result<(), Box<dyn std::error::Error>> {
    println!("=== JSON vs Binary Format Comparison Example ===\n");

    demonstrate_format_characteristics()?;
    demonstrate_size_comparisons()?;
    demonstrate_performance_benchmarks()?;
    demonstrate_use_case_recommendations()?;
    demonstrate_debugging_capabilities()?;
    cleanup_temp_files()?;

    println!("\n=== Example completed successfully! ===");
    Ok(())
}

/// Demonstrate basic format characteristics
fn demonstrate_format_characteristics() -> Result<(), Box<dyn std::error::Error>> {
    println!("--- Format Characteristics ---");

    // Create sample data structures
    let mut metadata = HashMap::new();
    metadata.insert("operation_type".to_string(), "benchmark".to_string());
    metadata.insert("system".to_string(), "train_station".to_string());

    let metrics = PerformanceMetrics {
        operation: "tensor_multiplication".to_string(),
        duration_micros: 1234,
        memory_usage_bytes: 8192,
        cpu_usage_percent: 75.5,
        throughput_ops_per_sec: 1000.0,
        metadata,
    };

    println!("Format characteristics analysis:");

    // JSON characteristics
    let json_data = metrics.to_json()?;
    let json_lines = json_data.lines().count();
    let json_chars = json_data.chars().count();

    println!("\nJSON Format:");
    println!("  Size: {} bytes", json_data.len());
    println!("  Characters: {}", json_chars);
    println!("  Lines: {}", json_lines);
    println!("  Human readable: Yes");
    println!("  Self-describing: Yes");
    println!("  Cross-platform: Yes");
    println!("  Compression ratio: Variable (depends on content)");

    // Show sample JSON output
    println!("  Sample output:");
    for line in json_data.lines().take(3) {
        println!("    {}", line);
    }
    if json_lines > 3 {
        println!("    ... ({} more lines)", json_lines - 3);
    }

    // Binary characteristics
    let binary_data = metrics.to_binary()?;

    println!("\nBinary Format:");
    println!("  Size: {} bytes", binary_data.len());
    println!("  Human readable: No");
    println!("  Self-describing: No (requires schema)");
    println!("  Cross-platform: Yes (with proper endianness handling)");
    println!("  Compression ratio: High (efficient encoding)");

    // Show sample binary output (hex)
    println!("  Sample output (first 32 bytes as hex):");
    print!("    ");
    for (i, byte) in binary_data.iter().take(32).enumerate() {
        if i > 0 && i % 16 == 0 {
            println!();
            print!("    ");
        }
        print!("{:02x} ", byte);
    }
    if binary_data.len() > 32 {
        println!("\n    ... ({} more bytes)", binary_data.len() - 32);
    } else {
        println!();
    }

    // Verify roundtrip for both formats
    let json_parsed = PerformanceMetrics::from_json(&json_data)?;
    let binary_parsed = PerformanceMetrics::from_binary(&binary_data)?;

    assert_eq!(metrics, json_parsed);
    assert_eq!(metrics, binary_parsed);
    println!("\nRoundtrip verification: PASSED");

    Ok(())
}

/// Demonstrate size comparisons across different data types
fn demonstrate_size_comparisons() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Size Comparison Analysis ---");

    // Test 1: Small configuration data (typical JSON use case)
    let mut db_settings = HashMap::new();
    db_settings.insert("host".to_string(), "localhost".to_string());
    db_settings.insert("port".to_string(), "5432".to_string());
    db_settings.insert("database".to_string(), "myapp".to_string());

    let config = Configuration {
        version: "1.2.3".to_string(),
        debug_enabled: true,
        log_level: "info".to_string(),
        database_settings: db_settings,
        feature_flags_enabled: true,
        max_connections: 100.0,
        timeout_seconds: 30.0,
    };

    // Test 2: Large numeric dataset (typical binary use case)
    let large_dataset = LargeDataset {
        name: "ML Training Data".to_string(),
        values: (0..1000).map(|i| i as f32 * 0.1).collect(),
        labels: (0..1000).map(|i| format!("label_{}", i)).collect(),
        feature_count: 100,
        feature_dimension: 50,
        timestamp_count: 1000,
        metadata: HashMap::new(),
    };

    println!("Size comparison results:");

    // Configuration comparison
    let config_json = config.to_json()?;
    let config_binary = config.to_binary()?;

    println!("\nConfiguration Data (small, text-heavy):");
    println!("  JSON: {} bytes", config_json.len());
    println!("  Binary: {} bytes", config_binary.len());
    println!(
        "  Ratio (JSON/Binary): {:.2}x",
        config_json.len() as f64 / config_binary.len() as f64
    );
    println!("  Recommendation: JSON (human readable, small size difference)");

    // Large dataset comparison
    let dataset_json = large_dataset.to_json()?;
    let dataset_binary = large_dataset.to_binary()?;

    println!("\nLarge Numeric Dataset (1000 values, 100x50 matrix):");
    println!(
        "  JSON: {} bytes ({:.1} KB)",
        dataset_json.len(),
        dataset_json.len() as f64 / 1024.0
    );
    println!(
        "  Binary: {} bytes ({:.1} KB)",
        dataset_binary.len(),
        dataset_binary.len() as f64 / 1024.0
    );
    println!(
        "  Ratio (JSON/Binary): {:.2}x",
        dataset_json.len() as f64 / dataset_binary.len() as f64
    );
    if dataset_json.len() > dataset_binary.len() {
        println!(
            "  Space saved with binary: {} bytes ({:.1} KB)",
            dataset_json.len() - dataset_binary.len(),
            (dataset_json.len() - dataset_binary.len()) as f64 / 1024.0
        );
        println!("  Recommendation: Binary (significant size reduction)");
    } else {
        println!(
            "  Binary overhead: {} bytes ({:.1} KB)",
            dataset_binary.len() - dataset_json.len(),
            (dataset_binary.len() - dataset_json.len()) as f64 / 1024.0
        );
        println!("  Recommendation: JSON (binary overhead not justified for this size)");
    }

    // Content analysis
    println!("\nContent Type Analysis:");

    // Analyze JSON content patterns
    let json_numbers = dataset_json.matches(char::is_numeric).count();
    let json_brackets = dataset_json.matches('[').count() + dataset_json.matches(']').count();
    let json_commas = dataset_json.matches(',').count();
    let json_quotes = dataset_json.matches('"').count();

    println!("  JSON overhead sources:");
    println!("    Numeric characters: ~{}", json_numbers);
    println!("    Brackets: ~{}", json_brackets);
    println!("    Commas: ~{}", json_commas);
    println!("    Quote marks: {}", json_quotes);
    println!("    Formatting/whitespace: Varies");

    println!("  Binary advantages:");
    println!("    Direct numeric encoding: 4-8 bytes per number");
    println!("    No formatting overhead: Zero bytes");
    println!("    Efficient length encoding: Minimal bytes");

    Ok(())
}

/// Demonstrate performance benchmarks
fn demonstrate_performance_benchmarks() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Performance Benchmark Analysis ---");

    // Create test data of varying sizes
    let small_config = Configuration {
        version: "1.0.0".to_string(),
        debug_enabled: false,
        log_level: "warn".to_string(),
        database_settings: HashMap::new(),
        feature_flags_enabled: false,
        max_connections: 100.0,
        timeout_seconds: 30.0,
    };

    let large_dataset = LargeDataset {
        name: "Large Dataset".to_string(),
        values: (0..5000).map(|i| i as f32 * 0.001).collect(),
        labels: (0..5000).map(|i| format!("large_item_{}", i)).collect(),
        feature_count: 200,
        feature_dimension: 25,
        timestamp_count: 5000,
        metadata: HashMap::new(),
    };

    println!("Performance benchmark results:");

    // Benchmark each dataset (avoiding trait objects due to object safety)
    let dataset_names = ["Small Config", "Large Dataset"];

    for (i, name) in dataset_names.iter().enumerate() {
        let mut comparison = FormatComparison::new(name.to_string());

        // JSON serialization benchmark
        let start = Instant::now();
        let json_data = match i {
            0 => small_config.to_json()?,
            _ => large_dataset.to_json()?,
        };
        comparison.json_serialize_micros = start.elapsed().as_micros() as u64;
        comparison.json_size_bytes = json_data.len() as u64;

        // JSON deserialization benchmark (type chosen to match the dataset)
        if *name == "Small Config" {
            let start = Instant::now();
            let _parsed = Configuration::from_json(&json_data)?;
            comparison.json_deserialize_micros = start.elapsed().as_micros() as u64;
        } else {
            let start = Instant::now();
            let _parsed = LargeDataset::from_json(&json_data)?;
            comparison.json_deserialize_micros = start.elapsed().as_micros() as u64;
        }

        // Binary serialization benchmark
        let start = Instant::now();
        let binary_data = match i {
            0 => small_config.to_binary()?,
            _ => large_dataset.to_binary()?,
        };
        comparison.binary_serialize_micros = start.elapsed().as_micros() as u64;
        comparison.binary_size_bytes = binary_data.len() as u64;

        // Binary deserialization benchmark
        if *name == "Small Config" {
            let start = Instant::now();
            let _parsed = Configuration::from_binary(&binary_data)?;
            comparison.binary_deserialize_micros = start.elapsed().as_micros() as u64;
        } else {
            let start = Instant::now();
            let _parsed = LargeDataset::from_binary(&binary_data)?;
            comparison.binary_deserialize_micros = start.elapsed().as_micros() as u64;
        }

        // Calculate ratios
        comparison.calculate_ratios();

        // Display results
        println!("\n{}:", name);
        println!(
            "  Size - JSON: {} bytes, Binary: {} bytes (ratio: {:.2}x)",
            comparison.json_size_bytes, comparison.binary_size_bytes, comparison.size_ratio
        );
        println!(
            "  Serialize - JSON: {}μs, Binary: {}μs (binary relative speed: {:.2}x)",
            comparison.json_serialize_micros,
            comparison.binary_serialize_micros,
            comparison.serialize_speed_ratio
        );
        println!(
            "  Deserialize - JSON: {}μs, Binary: {}μs (binary relative speed: {:.2}x)",
            comparison.json_deserialize_micros,
            comparison.binary_deserialize_micros,
            comparison.deserialize_speed_ratio
        );
    }

    println!("\nPerformance Summary:");
    println!("  - Binary format consistently uses less storage space");
    println!("  - Performance differences vary by data type and size");
    println!("  - Larger datasets show more significant binary advantages");
    println!("  - JSON parsing overhead increases with structure complexity");

    Ok(())
}

/// Demonstrate use case recommendations
fn demonstrate_use_case_recommendations() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Use Case Recommendations ---");

    println!("JSON Format - Recommended for:");
    println!("  ✓ Configuration files (human-editable)");
    println!("  ✓ API responses (web compatibility)");
    println!("  ✓ Debugging and development (readability)");
    println!("  ✓ Small data structures (minimal overhead)");
    println!("  ✓ Cross-language interoperability");
    println!("  ✓ Schema evolution (self-describing)");
    println!("  ✓ Text-heavy data with few numbers");

    println!("\nBinary Format - Recommended for:");
    println!("  ✓ Large datasets (memory/storage efficiency)");
    println!("  ✓ High-performance applications (speed critical)");
    println!("  ✓ Numeric-heavy data (ML models, matrices)");
    println!("  ✓ Network transmission (bandwidth limited)");
    println!("  ✓ Embedded systems (resource constrained)");
    println!("  ✓ Long-term storage (space efficiency)");
    println!("  ✓ Frequent serialization/deserialization");

    // Demonstrate decision matrix
    println!("\nDecision Matrix Example:");

    let scenarios = vec![
        (
            "Web API Configuration",
            "JSON",
            "Human readable, web standard, small size",
        ),
        (
            "ML Model Weights",
            "Binary",
            "Large numeric data, performance critical",
        ),
        (
            "User Preferences",
            "JSON",
            "Human editable, self-documenting",
        ),
        (
            "Real-time Telemetry",
            "Binary",
            "High frequency, bandwidth limited",
        ),
        (
            "Application Settings",
            "JSON",
            "Developer accessible, version control friendly",
        ),
        (
            "Scientific Dataset",
            "Binary",
            "Large arrays, storage efficiency critical",
        ),
    ];

    for (scenario, recommendation, reason) in scenarios {
        println!("  {} -> {} ({})", scenario, recommendation, reason);
    }

    // Create examples for common scenarios
    println!("\nPractical Examples:");

    // Configuration file example (JSON)
    let config = Configuration {
        version: "2.1.0".to_string(),
        debug_enabled: false,
        log_level: "info".to_string(),
        database_settings: {
            let mut map = HashMap::new();
            map.insert("url".to_string(), "postgresql://localhost/app".to_string());
            map.insert("pool_size".to_string(), "10".to_string());
            map
        },
        feature_flags_enabled: true,
        max_connections: 100.0,
        timeout_seconds: 30.0,
    };

    config.save_json("temp_config_example.json")?;
    let config_content = fs::read_to_string("temp_config_example.json")?;

    println!("\nConfiguration File (JSON) - Human readable:");
    for line in config_content.lines().take(5) {
        println!("  {}", line);
    }
    println!("  ... (easily editable by developers)");

    // Data export example (Binary)
    let export_data = LargeDataset {
        name: "Training Export".to_string(),
        values: (0..1000).map(|i| (i as f32).sin()).collect(),
723 labels: (0..1000).map(|i| format!("sample_{:04}", i)).collect(),
724 feature_count: 50,
725 feature_dimension: 20,
726 timestamp_count: 1000,
727 metadata: HashMap::new(),
728 };
729
730 export_data.save_binary("temp_export_example.bin")?;
731 let export_size = fs::metadata("temp_export_example.bin")?.len();
732
733 println!("\nData Export (Binary) - Efficient storage:");
734 println!(
735 " File size: {} bytes ({:.1} KB)",
736 export_size,
737 export_size as f64 / 1024.0
738 );
739 println!(" 1000 numeric values + 50x20 matrix + metadata");
740 println!(" Compact encoding saves significant space vs JSON");
741
742 Ok(())
743}
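The debugging demonstration below prints a hex dump with an inline loop; that logic can be factored into a small reusable helper. A minimal sketch using only the standard library (the `hex_dump` name is illustrative, not part of any library API):

```rust
/// Format up to `max_bytes` of a byte slice as a hex dump,
/// 16 bytes per row, bytes separated by single spaces.
fn hex_dump(bytes: &[u8], max_bytes: usize) -> String {
    let mut out = String::new();
    for (i, byte) in bytes.iter().take(max_bytes).enumerate() {
        if i > 0 {
            // Start a new row every 16 bytes; otherwise separate with a space.
            out.push(if i % 16 == 0 { '\n' } else { ' ' });
        }
        out.push_str(&format!("{byte:02x}"));
    }
    out
}
```

With a helper like this, the inline dump loop in `demonstrate_debugging_capabilities` reduces to roughly `println!("  {}", hex_dump(&binary_data, 40))`.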
744
745/// Demonstrate debugging capabilities
746fn demonstrate_debugging_capabilities() -> Result<(), Box<dyn std::error::Error>> {
747 println!("\n--- Debugging Capabilities ---");
748
749 let mut metadata = HashMap::new();
750 metadata.insert("debug_session".to_string(), "session_123".to_string());
751 metadata.insert("error_code".to_string(), "E001".to_string());
752
753 let debug_metrics = PerformanceMetrics {
754 operation: "debug_test".to_string(),
755 duration_micros: 5432,
756 memory_usage_bytes: 16384,
757 cpu_usage_percent: 42.7,
758 throughput_ops_per_sec: 750.0,
759 metadata,
760 };
761
762 println!("Debugging Comparison:");
763
764 // JSON debugging advantages
765 let json_data = debug_metrics.to_json()?;
766 println!("\nJSON Format - Debugging Advantages:");
767 println!(" ✓ Human readable without tools");
768 println!(" ✓ Can inspect values directly");
769 println!(" ✓ Text editors show structure");
770 println!(" ✓ Diff tools work naturally");
771 println!(" ✓ Version control friendly");
772
773 println!("\n Sample JSON output for debugging:");
774 for (i, line) in json_data.lines().enumerate() {
775 if i < 5 {
776 println!(" {}", line);
777 }
778 }
779
780 // Binary debugging limitations
781 let binary_data = debug_metrics.to_binary()?;
782 println!("\nBinary Format - Debugging Limitations:");
783 println!(" ✗ Requires special tools to inspect");
784 println!(" ✗ Not human readable");
785 println!(" ✗ Difficult to debug data corruption");
786 println!(" ✗ Version control shows as binary diff");
787
788 println!("\n Binary data (hex dump for debugging):");
789 print!(" ");
790 for (i, byte) in binary_data.iter().take(40).enumerate() {
791 if i > 0 && i % 16 == 0 {
792 println!();
793 print!(" ");
794 }
795 print!("{:02x} ", byte);
796 }
797 println!("\n (requires hex editor or custom tools)");
798
799 // Development workflow comparison
800 println!("\nDevelopment Workflow Impact:");
801
802 println!("\nJSON Workflow:");
803 println!(" 1. Save data to JSON file");
804 println!(" 2. Open in any text editor");
805 println!(" 3. Inspect values directly");
806 println!(" 4. Make manual edits if needed");
807 println!(" 5. Version control tracks changes");
808
809 println!("\nBinary Workflow:");
810 println!(" 1. Save data to binary file");
811 println!(" 2. Write debugging code to load and print");
812 println!(" 3. Use hex editor for low-level inspection");
813 println!(" 4. Cannot make manual edits easily");
814 println!(" 5. Version control shows binary changes only");
815
816 // Hybrid approach recommendation
817 println!("\nHybrid Approach for Development:");
818 println!(" - Use JSON during development/debugging");
819 println!(" - Switch to binary for production deployment");
820 println!(" - Provide debugging tools that export binary to JSON");
821 println!(" - Include format conversion utilities");
822
823 // Demonstrate debugging scenario
824 println!("\nDebugging Scenario Example:");
825 println!(" Problem: Performance metrics show unexpected values");
826
827 // Save both formats for comparison
828 debug_metrics.save_json("temp_debug_metrics.json")?;
829 debug_metrics.save_binary("temp_debug_metrics.bin")?;
830
831 println!(" JSON approach: Open temp_debug_metrics.json in editor");
832 println!(" -> Immediately see cpu_usage_percent: 42.7");
833 println!(" -> Compare with expected range");
834 println!(" -> Check metadata for debug_session: 'session_123'");
835
836 println!(" Binary approach: Write debugging code");
837 println!(" -> Load binary file programmatically");
838 println!(" -> Print values to console");
839 println!(" -> Additional development time required");
840
841 Ok(())
}

73 impl ToFieldValue for ContactInfo {
74     fn to_field_value(&self) -> FieldValue {
75 match self.to_json() {
76 Ok(json_str) => FieldValue::from_json_object(json_str),
77 Err(_) => FieldValue::from_string("serialization_error".to_string()),
78 }
79 }
80}
81
82impl FromFieldValue for ContactInfo {
83 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
84 // Try JSON object first
85 if let Ok(json_data) = value.as_json_object() {
86 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
87 field: field_name.to_string(),
88 message: format!("Failed to deserialize ContactInfo from JSON: {}", e),
89 });
90 }
91
92 // Try binary object
93 if let Ok(binary_data) = value.as_binary_object() {
94 return Self::from_binary(binary_data).map_err(|e| {
95 SerializationError::ValidationFailed {
96 field: field_name.to_string(),
97 message: format!("Failed to deserialize ContactInfo from binary: {}", e),
98 }
99 });
100 }
101
102 Err(SerializationError::ValidationFailed {
103 field: field_name.to_string(),
104 message: format!(
105 "Expected JsonObject or BinaryObject for ContactInfo, found {}",
106 value.type_name()
107 ),
108 })
109 }
110}
111
112/// Address struct
113#[derive(Debug, Clone, PartialEq)]
114pub struct Address {
115 pub street: String,
116 pub city: String,
117 pub state: String,
118 pub postal_code: String,
119 pub country: String,
120}
121
122impl StructSerializable for Address {
123 fn to_serializer(&self) -> StructSerializer {
124 StructSerializer::new()
125 .field("street", &self.street)
126 .field("city", &self.city)
127 .field("state", &self.state)
128 .field("postal_code", &self.postal_code)
129 .field("country", &self.country)
130 }
131
132 fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
133 let street = deserializer.field("street")?;
134 let city = deserializer.field("city")?;
135 let state = deserializer.field("state")?;
136 let postal_code = deserializer.field("postal_code")?;
137 let country = deserializer.field("country")?;
138
139 Ok(Address {
140 street,
141 city,
142 state,
143 postal_code,
144 country,
145 })
146 }
147}
148
149impl ToFieldValue for Address {
150 fn to_field_value(&self) -> FieldValue {
151 match self.to_json() {
152 Ok(json_str) => FieldValue::from_json_object(json_str),
153 Err(_) => FieldValue::from_string("serialization_error".to_string()),
154 }
155 }
156}
157
158impl FromFieldValue for Address {
159 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
160 // Try JSON object first
161 if let Ok(json_data) = value.as_json_object() {
162 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
163 field: field_name.to_string(),
164 message: format!("Failed to deserialize Address from JSON: {}", e),
165 });
166 }
167
168 // Try binary object
169 if let Ok(binary_data) = value.as_binary_object() {
170 return Self::from_binary(binary_data).map_err(|e| {
171 SerializationError::ValidationFailed {
172 field: field_name.to_string(),
173 message: format!("Failed to deserialize Address from binary: {}", e),
174 }
175 });
176 }
177
178 Err(SerializationError::ValidationFailed {
179 field: field_name.to_string(),
180 message: format!(
181 "Expected JsonObject or BinaryObject for Address, found {}",
182 value.type_name()
183 ),
184 })
185 }
186}
187
188/// Project information struct
189#[derive(Debug, Clone, PartialEq)]
190pub struct Project {
191 pub name: String,
192 pub description: String,
193 pub status: ProjectStatus,
194 pub budget: f64,
195 pub team_members: Vec<String>,
196 pub milestones: Vec<Milestone>,
197 pub metadata: HashMap<String, String>,
198}
199
200impl StructSerializable for Project {
201 fn to_serializer(&self) -> StructSerializer {
202 StructSerializer::new()
203 .field("name", &self.name)
204 .field("description", &self.description)
205 .field("status", &self.status)
206 .field("budget", &self.budget)
207 .field("team_members", &self.team_members)
208 .field("milestones", &self.milestones)
209 .field("metadata", &self.metadata)
210 }
211
212 fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
213 let name = deserializer.field("name")?;
214 let description = deserializer.field("description")?;
215 let status = deserializer.field("status")?;
216 let budget = deserializer.field("budget")?;
217 let team_members = deserializer.field("team_members")?;
218 let milestones = deserializer.field("milestones")?;
219 let metadata = deserializer.field("metadata")?;
220
221 Ok(Project {
222 name,
223 description,
224 status,
225 budget,
226 team_members,
227 milestones,
228 metadata,
229 })
230 }
231}
232
233impl ToFieldValue for Project {
234 fn to_field_value(&self) -> FieldValue {
235 match self.to_json() {
236 Ok(json_str) => FieldValue::from_json_object(json_str),
237 Err(_) => FieldValue::from_string("serialization_error".to_string()),
238 }
239 }
240}
241
242impl FromFieldValue for Project {
243 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
244 // Try JSON object first
245 if let Ok(json_data) = value.as_json_object() {
246 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
247 field: field_name.to_string(),
248 message: format!("Failed to deserialize Project from JSON: {}", e),
249 });
250 }
251
252 // Try binary object
253 if let Ok(binary_data) = value.as_binary_object() {
254 return Self::from_binary(binary_data).map_err(|e| {
255 SerializationError::ValidationFailed {
256 field: field_name.to_string(),
257 message: format!("Failed to deserialize Project from binary: {}", e),
258 }
259 });
260 }
261
262 Err(SerializationError::ValidationFailed {
263 field: field_name.to_string(),
264 message: format!(
265 "Expected JsonObject or BinaryObject for Project, found {}",
266 value.type_name()
267 ),
268 })
269 }
270}
271
272/// Project status enumeration
273#[derive(Debug, Clone, PartialEq)]
274pub enum ProjectStatus {
275 Planning,
276 InProgress,
277 OnHold,
278 Completed,
279 Cancelled,
280}
281
282impl ToFieldValue for ProjectStatus {
283 fn to_field_value(&self) -> FieldValue {
284 let status_str = match self {
285 ProjectStatus::Planning => "planning",
286 ProjectStatus::InProgress => "in_progress",
287 ProjectStatus::OnHold => "on_hold",
288 ProjectStatus::Completed => "completed",
289 ProjectStatus::Cancelled => "cancelled",
290 };
291 FieldValue::from_string(status_str.to_string())
292 }
293}
294
295impl FromFieldValue for ProjectStatus {
296 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
297 match value {
298 FieldValue::String(s) => match s.as_str() {
299 "planning" => Ok(ProjectStatus::Planning),
300 "in_progress" => Ok(ProjectStatus::InProgress),
301 "on_hold" => Ok(ProjectStatus::OnHold),
302 "completed" => Ok(ProjectStatus::Completed),
303 "cancelled" => Ok(ProjectStatus::Cancelled),
304 _ => Err(SerializationError::ValidationFailed {
305 field: field_name.to_string(),
306 message: format!("Unknown project status: {}", s),
307 }),
308 },
309 _ => Err(SerializationError::ValidationFailed {
310 field: field_name.to_string(),
311 message: format!(
312 "Expected String for ProjectStatus, found {}",
313 value.type_name()
314 ),
315 }),
316 }
317 }
318}
319
320/// Project milestone struct
321#[derive(Debug, Clone, PartialEq)]
322pub struct Milestone {
323 pub name: String,
324 pub description: String,
325 pub due_date: String, // Simplified as string for this example
326 pub is_completed: bool,
327 pub progress_percentage: f32,
328 pub dependencies: Vec<String>,
329}
330
331impl StructSerializable for Milestone {
332 fn to_serializer(&self) -> StructSerializer {
333 StructSerializer::new()
334 .field("name", &self.name)
335 .field("description", &self.description)
336 .field("due_date", &self.due_date)
337 .field("is_completed", &self.is_completed)
338 .field("progress_percentage", &self.progress_percentage)
339 .field("dependencies", &self.dependencies)
340 }
341
342 fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
343 let name = deserializer.field("name")?;
344 let description = deserializer.field("description")?;
345 let due_date = deserializer.field("due_date")?;
346 let is_completed = deserializer.field("is_completed")?;
347 let progress_percentage = deserializer.field("progress_percentage")?;
348 let dependencies = deserializer.field("dependencies")?;
349
350 Ok(Milestone {
351 name,
352 description,
353 due_date,
354 is_completed,
355 progress_percentage,
356 dependencies,
357 })
358 }
359}
360
361impl ToFieldValue for Milestone {
362 fn to_field_value(&self) -> FieldValue {
363 match self.to_json() {
364 Ok(json_str) => FieldValue::from_json_object(json_str),
365 Err(_) => FieldValue::from_string("serialization_error".to_string()),
366 }
367 }
368}
369
370impl FromFieldValue for Milestone {
371 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
372 // Try JSON object first
373 if let Ok(json_data) = value.as_json_object() {
374 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
375 field: field_name.to_string(),
376 message: format!("Failed to deserialize Milestone from JSON: {}", e),
377 });
378 }
379
380 // Try binary object
381 if let Ok(binary_data) = value.as_binary_object() {
382 return Self::from_binary(binary_data).map_err(|e| {
383 SerializationError::ValidationFailed {
384 field: field_name.to_string(),
385 message: format!("Failed to deserialize Milestone from binary: {}", e),
386 }
387 });
388 }
389
390 Err(SerializationError::ValidationFailed {
391 field: field_name.to_string(),
392 message: format!(
393 "Expected JsonObject or BinaryObject for Milestone, found {}",
394 value.type_name()
395 ),
396 })
397 }
398}
399
400/// Company struct with basic collections and nesting
401#[derive(Debug, Clone, PartialEq)]
402pub struct Company {
403 pub name: String,
404 pub founded_year: i32,
405 pub headquarters_city: String,
406 pub headquarters_state: String,
407 pub employee_count: usize,
408 pub department_names: Vec<String>,
409 pub active_project_names: Vec<String>,
410 pub company_metadata: HashMap<String, String>,
411}
412
413impl StructSerializable for Company {
414 fn to_serializer(&self) -> StructSerializer {
415 StructSerializer::new()
416 .field("name", &self.name)
417 .field("founded_year", &self.founded_year)
418 .field("headquarters_city", &self.headquarters_city)
419 .field("headquarters_state", &self.headquarters_state)
420 .field("employee_count", &self.employee_count)
421 .field("department_names", &self.department_names)
422 .field("active_project_names", &self.active_project_names)
423 .field("company_metadata", &self.company_metadata)
424 }
425
426 fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
427 let name = deserializer.field("name")?;
428 let founded_year = deserializer.field("founded_year")?;
429 let headquarters_city = deserializer.field("headquarters_city")?;
430 let headquarters_state = deserializer.field("headquarters_state")?;
431 let employee_count = deserializer.field("employee_count")?;
432 let department_names = deserializer.field("department_names")?;
433 let active_project_names = deserializer.field("active_project_names")?;
434 let company_metadata = deserializer.field("company_metadata")?;
435
436 Ok(Company {
437 name,
438 founded_year,
439 headquarters_city,
440 headquarters_state,
441 employee_count,
442 department_names,
443 active_project_names,
444 company_metadata,
445 })
446 }
447}
448
449impl ToFieldValue for Company {
450 fn to_field_value(&self) -> FieldValue {
451 match self.to_json() {
452 Ok(json_str) => FieldValue::from_json_object(json_str),
453 Err(_) => FieldValue::from_string("serialization_error".to_string()),
454 }
455 }
456}
457
458impl FromFieldValue for Company {
459 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
460 // Try JSON object first
461 if let Ok(json_data) = value.as_json_object() {
462 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
463 field: field_name.to_string(),
464 message: format!("Failed to deserialize Company from JSON: {}", e),
465 });
466 }
467
468 // Try binary object
469 if let Ok(binary_data) = value.as_binary_object() {
470 return Self::from_binary(binary_data).map_err(|e| {
471 SerializationError::ValidationFailed {
472 field: field_name.to_string(),
473 message: format!("Failed to deserialize Company from binary: {}", e),
474 }
475 });
476 }
477
478 Err(SerializationError::ValidationFailed {
479 field: field_name.to_string(),
480 message: format!(
481 "Expected JsonObject or BinaryObject for Company, found {}",
482 value.type_name()
483 ),
484 })
485 }
486}
487
488/// Department struct
489#[derive(Debug, Clone, PartialEq)]
490pub struct Department {
491 pub name: String,
492 pub manager: String,
493 pub employee_count: u32,
494 pub budget: f64,
495 pub office_locations: Vec<Address>,
496}
497
498impl StructSerializable for Department {
499 fn to_serializer(&self) -> StructSerializer {
500 StructSerializer::new()
501 .field("name", &self.name)
502 .field("manager", &self.manager)
503 .field("employee_count", &self.employee_count)
504 .field("budget", &self.budget)
505 .field("office_locations", &self.office_locations)
506 }
507
508 fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
509 let name = deserializer.field("name")?;
510 let manager = deserializer.field("manager")?;
511 let employee_count = deserializer.field("employee_count")?;
512 let budget = deserializer.field("budget")?;
513 let office_locations = deserializer.field("office_locations")?;
514
515 Ok(Department {
516 name,
517 manager,
518 employee_count,
519 budget,
520 office_locations,
521 })
522 }
523}
524
525impl ToFieldValue for Department {
526 fn to_field_value(&self) -> FieldValue {
527 match self.to_json() {
528 Ok(json_str) => FieldValue::from_json_object(json_str),
529 Err(_) => FieldValue::from_string("serialization_error".to_string()),
530 }
531 }
532}
533
534impl FromFieldValue for Department {
535 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
536 // Try JSON object first
537 if let Ok(json_data) = value.as_json_object() {
538 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
539 field: field_name.to_string(),
540 message: format!("Failed to deserialize Department from JSON: {}", e),
541 });
542 }
543
544 // Try binary object
545 if let Ok(binary_data) = value.as_binary_object() {
546 return Self::from_binary(binary_data).map_err(|e| {
547 SerializationError::ValidationFailed {
548 field: field_name.to_string(),
549 message: format!("Failed to deserialize Department from binary: {}", e),
550 }
551 });
552 }
553
554 Err(SerializationError::ValidationFailed {
555 field: field_name.to_string(),
556 message: format!(
557 "Expected JsonObject or BinaryObject for Department, found {}",
558 value.type_name()
559 ),
560 })
561 }
562}
563
564fn main() -> Result<(), Box<dyn std::error::Error>> {
565 println!("=== Nested Structures Serialization Example ===\n");
566
567 demonstrate_nested_struct_creation()?;
568 demonstrate_deep_serialization()?;
569 demonstrate_collection_nesting()?;
570 demonstrate_partial_loading()?;
571 demonstrate_performance_analysis()?;
572 cleanup_temp_files()?;
573
574 println!("\n=== Example completed successfully! ===");
575 Ok(())
576}
577
578/// Demonstrate creating complex nested structures
579fn demonstrate_nested_struct_creation() -> Result<(), Box<dyn std::error::Error>> {
580 println!("--- Nested Structure Creation ---");
581
582 // Create nested address and contact info
583 let headquarters = Address {
584 street: "123 Innovation Drive".to_string(),
585 city: "Tech City".to_string(),
586 state: "CA".to_string(),
587 postal_code: "94000".to_string(),
588 country: "USA".to_string(),
589 };
590
591 let mut social_media = HashMap::new();
592 social_media.insert("twitter".to_string(), "@techcorp".to_string());
593 social_media.insert("linkedin".to_string(), "techcorp-inc".to_string());
594
595 let contact_info = ContactInfo {
596 email: "info@techcorp.com".to_string(),
597 phone: Some("+1-555-0123".to_string()),
598 address_city: headquarters.city.clone(),
599 address_state: headquarters.state.clone(),
600 social_media,
601 };
602
603 // Create departments with nested office locations
604 let engineering_office = Address {
605 street: "456 Developer Lane".to_string(),
606 city: "Code City".to_string(),
607 state: "CA".to_string(),
608 postal_code: "94001".to_string(),
609 country: "USA".to_string(),
610 };
611
612 let departments = [
613 Department {
614 name: "Engineering".to_string(),
615 manager: "Alice Johnson".to_string(),
616 employee_count: 50,
617 budget: 2500000.0,
618 office_locations: vec![engineering_office, headquarters.clone()],
619 },
620 Department {
621 name: "Marketing".to_string(),
622 manager: "Bob Smith".to_string(),
623 employee_count: 15,
624 budget: 800000.0,
625 office_locations: vec![headquarters.clone()],
626 },
627 ];
628
629 // Create projects with milestones
630 let milestones = vec![
631 Milestone {
632 name: "Requirements Analysis".to_string(),
633 description: "Complete system requirements documentation".to_string(),
634 due_date: "2024-03-15".to_string(),
635 is_completed: true,
636 progress_percentage: 100.0,
637 dependencies: vec![],
638 },
639 Milestone {
640 name: "Architecture Design".to_string(),
641 description: "Define system architecture and components".to_string(),
642 due_date: "2024-04-01".to_string(),
643 is_completed: false,
644 progress_percentage: 75.0,
645 dependencies: vec!["Requirements Analysis".to_string()],
646 },
647 ];
648
649 let mut project_metadata = HashMap::new();
650 project_metadata.insert("priority".to_string(), "high".to_string());
651 project_metadata.insert("client".to_string(), "internal".to_string());
652
653 let projects = [Project {
654 name: "Train Station ML Platform".to_string(),
655 description: "Next-generation machine learning infrastructure".to_string(),
656 status: ProjectStatus::InProgress,
657 budget: 1500000.0,
658 team_members: vec![
659 "Alice Johnson".to_string(),
660 "Charlie Brown".to_string(),
661 "Diana Prince".to_string(),
662 ],
663 milestones: milestones.clone(),
664 metadata: project_metadata,
665 }];
666
667 // Create the complete company structure
668 let mut company_metadata = HashMap::new();
669 company_metadata.insert("industry".to_string(), "technology".to_string());
670 company_metadata.insert("stock_symbol".to_string(), "TECH".to_string());
671
672 let company = Company {
673 name: "TechCorp Inc.".to_string(),
674 founded_year: 2015,
675 headquarters_city: headquarters.city.clone(),
676 headquarters_state: headquarters.state.clone(),
677 employee_count: 250,
678 department_names: departments.iter().map(|d| d.name.clone()).collect(),
679 active_project_names: projects.iter().map(|p| p.name.clone()).collect(),
680 company_metadata,
681 };
682
683 println!("Created complex company structure:");
684 println!(" Company: {}", company.name);
685 println!(" Founded: {}", company.founded_year);
686 println!(
687 " Headquarters: {}, {}",
688 company.headquarters_city, company.headquarters_state
689 );
690 println!(" Employee Count: {}", company.employee_count);
691 println!(" Departments: {}", company.department_names.len());
692 println!(" Active Projects: {}", company.active_project_names.len());
693
694 // Save the complete structure
695 company.save_json("temp_nested_company.json")?;
696 println!("Saved nested structure to: temp_nested_company.json");
697
698 // Verify loading preserves all nested data
699 let loaded_company = Company::load_json("temp_nested_company.json")?;
700 assert_eq!(company, loaded_company);
701 println!("Successfully verified Company roundtrip serialization");
702
703 // Also demonstrate individual component serialization
704 let address_json = headquarters.to_json()?;
705 let loaded_address = Address::from_json(&address_json)?;
706 assert_eq!(headquarters, loaded_address);
707 println!("Successfully serialized/deserialized Address component");
708
709 let contact_json = contact_info.to_json()?;
710 let loaded_contact = ContactInfo::from_json(&contact_json)?;
711 assert_eq!(contact_info, loaded_contact);
712 println!("Successfully serialized/deserialized ContactInfo component");
713 println!("Nested structure integrity: VERIFIED");
714
715 Ok(())
716}
717
718/// Demonstrate deep serialization with complex nesting
719fn demonstrate_deep_serialization() -> Result<(), Box<dyn std::error::Error>> {
720 println!("\n--- Deep Serialization Analysis ---");
721
722 let deep_milestone = Milestone {
723 name: "Deep Milestone".to_string(),
724 description: "Testing deep nesting serialization".to_string(),
725 due_date: "2024-12-31".to_string(),
726 is_completed: false,
727 progress_percentage: 50.0,
728 dependencies: vec!["Parent Task".to_string(), "Sibling Task".to_string()],
729 };
730
731 let deep_project = Project {
732 name: "Deep Nesting Test".to_string(),
733 description: "Project for testing serialization depth".to_string(),
734 status: ProjectStatus::Planning,
735 budget: 100000.0,
736 team_members: vec!["Developer 1".to_string(), "Developer 2".to_string()],
737 milestones: vec![deep_milestone],
738 metadata: HashMap::new(),
739 };
740
741 // Analyze serialization output
742 let json_output = deep_project.to_json()?;
743 let binary_output = deep_project.to_binary()?;
744
745 println!("Deep structure serialization analysis:");
746 println!(" JSON size: {} bytes", json_output.len());
747 println!(" Binary size: {} bytes", binary_output.len());
748 println!(" Nesting levels: Project -> Milestone -> Dependencies");
749
750 // Count nested objects in JSON (rough estimate)
751 let object_count = json_output.matches('{').count();
752 let array_count = json_output.matches('[').count();
753 println!(" JSON objects: {}", object_count);
754 println!(" JSON arrays: {}", array_count);
755
756 // Verify deep roundtrip
757 let json_parsed = Project::from_json(&json_output)?;
758 let binary_parsed = Project::from_binary(&binary_output)?;
759
760 assert_eq!(deep_project, json_parsed);
761 assert_eq!(deep_project, binary_parsed);
762 println!("Deep serialization roundtrip: VERIFIED");
763
764 Ok(())
765}
766
767/// Demonstrate collection nesting patterns
768fn demonstrate_collection_nesting() -> Result<(), Box<dyn std::error::Error>> {
769 println!("\n--- Collection Nesting Patterns ---");
770
771 // Create multiple departments with varying complexity
772 let departments = vec![
773 Department {
774 name: "Research".to_string(),
775 manager: "Dr. Science".to_string(),
776 employee_count: 25,
777 budget: 1200000.0,
778 office_locations: vec![
779 Address {
780 street: "1 Research Blvd".to_string(),
781 city: "Innovation Hub".to_string(),
782 state: "MA".to_string(),
783 postal_code: "02101".to_string(),
784 country: "USA".to_string(),
785 },
786 Address {
787 street: "2 Lab Street".to_string(),
788 city: "Tech Valley".to_string(),
789 state: "NY".to_string(),
790 postal_code: "12180".to_string(),
791 country: "USA".to_string(),
792 },
793 ],
794 },
795 Department {
796 name: "Quality Assurance".to_string(),
797 manager: "Test Master".to_string(),
798 employee_count: 12,
799 budget: 600000.0,
800 office_locations: vec![], // Empty collection
801 },
802 ];
803
804 println!("Collection nesting analysis:");
805 println!(" Departments: {}", departments.len());
806
807 let total_locations: usize = departments.iter().map(|d| d.office_locations.len()).sum();
808 println!(" Total office locations: {}", total_locations);
809
810 // Test serialization with mixed empty and populated collections
811 // Note: Vec<Department> doesn't implement StructSerializable directly.
812 // For this example, we'll serialize each department individually
813 let department_json_strings: Result<Vec<String>, _> =
814 departments.iter().map(|dept| dept.to_json()).collect();
815 let department_json_strings = department_json_strings?;
816
817 // Deserialize each department back
818 let parsed_departments: Result<Vec<Department>, _> = department_json_strings
819 .iter()
820 .map(|json_str| Department::from_json(json_str))
821 .collect();
822 let parsed_departments = parsed_departments?;
823
824 assert_eq!(departments, parsed_departments);
825 println!("Collection nesting serialization: VERIFIED");
826
827 // Analyze collection patterns
828 for (i, dept) in departments.iter().enumerate() {
829 println!(
830 " Department {}: {} locations",
831 i + 1,
832 dept.office_locations.len()
833 );
834 }
835
836 Ok(())
837}
838
839/// Demonstrate partial loading and field access
840fn demonstrate_partial_loading() -> Result<(), Box<dyn std::error::Error>> {
841 println!("\n--- Partial Loading and Field Access ---");
842
843 // Create a simple project for analysis
844 let project = Project {
845 name: "Sample Project".to_string(),
846 description: "For testing partial loading".to_string(),
847 status: ProjectStatus::InProgress,
848 budget: 50000.0,
849 team_members: vec!["Alice".to_string(), "Bob".to_string()],
850 milestones: vec![Milestone {
851 name: "Phase 1".to_string(),
852 description: "Initial phase".to_string(),
853 due_date: "2024-06-01".to_string(),
854 is_completed: true,
855 progress_percentage: 100.0,
856 dependencies: vec![],
857 }],
858 metadata: HashMap::new(),
859 };
860
861 // Convert to JSON and analyze structure
862 println!("Project JSON structure analysis:");
863
864 // Parse to examine available fields by inspecting JSON structure
865 let json_data = project.to_json()?;
866 let field_count = json_data.matches(':').count();
867 println!(" Estimated fields: {}", field_count);
868
869 // Show top-level structure
870 let lines: Vec<&str> = json_data.lines().take(10).collect();
871 println!(" JSON structure preview:");
872 for line in lines.iter().take(5) {
873 if let Some(colon_pos) = line.find(':') {
874 let field_name = line[..colon_pos].trim().trim_matches('"').trim();
875 if !field_name.is_empty() {
876 println!(" - {}", field_name);
877 }
878 }
879 }
880
881 // Demonstrate field type analysis
882 println!("\nField type analysis:");
883 println!(" name: String");
884 println!(" status: Enum -> String");
885 println!(" budget: f64 -> Number");
886 println!(" team_members: Vec<String> -> Array");
887 println!(" milestones: Vec<Milestone> -> Array of Objects");
888
889 Ok(())
890}
891
892/// Demonstrate performance analysis for nested structures
893fn demonstrate_performance_analysis() -> Result<(), Box<dyn std::error::Error>> {
894 println!("\n--- Performance Analysis ---");
895
896 // Create structures of varying complexity
897 let simple_address = Address {
898 street: "123 Main St".to_string(),
899 city: "Anytown".to_string(),
900 state: "ST".to_string(),
901 postal_code: "12345".to_string(),
902 country: "USA".to_string(),
903 };
904
905 let complex_department = Department {
906 name: "Complex Department".to_string(),
907 manager: "Manager Name".to_string(),
908 employee_count: 100,
909 budget: 5000000.0,
910 office_locations: vec![simple_address.clone(); 10], // 10 identical addresses
911 };
912
913 let complex_project = Project {
914 name: "Complex Project".to_string(),
915 description: "Large project with many components".to_string(),
916 status: ProjectStatus::InProgress,
917 budget: 2000000.0,
918 team_members: (1..=50).map(|i| format!("Team Member {}", i)).collect(),
919 milestones: (1..=20)
920 .map(|i| Milestone {
921 name: format!("Milestone {}", i),
922 description: format!("Description for milestone {}", i),
923 due_date: "2024-12-31".to_string(),
924 is_completed: i <= 10,
925 progress_percentage: if i <= 10 { 100.0 } else { 50.0 },
926 dependencies: if i > 1 {
927 vec![format!("Milestone {}", i - 1)]
928 } else {
929 vec![]
930 },
931 })
932 .collect(),
933 metadata: HashMap::new(),
934 };
935
936 // Measure serialization performance
937 println!("Performance comparison:");
938
939 // Simple address
940 let addr_json = simple_address.to_json()?;
941 let addr_binary = simple_address.to_binary()?;
942 println!(" Simple Address:");
943 println!(" JSON: {} bytes", addr_json.len());
944 println!(" Binary: {} bytes", addr_binary.len());
945
946 // Complex department
947 let dept_json = complex_department.to_json()?;
948 let dept_binary = complex_department.to_binary()?;
949 println!(" Complex Department (10 addresses):");
950 println!(" JSON: {} bytes", dept_json.len());
951 println!(" Binary: {} bytes", dept_binary.len());
952
953 // Complex project
954 let proj_json = complex_project.to_json()?;
955 let proj_binary = complex_project.to_binary()?;
956 println!(" Complex Project (50 members, 20 milestones):");
957 println!(" JSON: {} bytes", proj_json.len());
958 println!(" Binary: {} bytes", proj_binary.len());
959
960 // Calculate efficiency ratios
961 let dept_ratio = dept_json.len() as f64 / dept_binary.len() as f64;
962 let proj_ratio = proj_json.len() as f64 / proj_binary.len() as f64;
963
964 println!("\nFormat efficiency (JSON/Binary ratio):");
965 println!(" Department: {:.2}x", dept_ratio);
966 println!(" Project: {:.2}x", proj_ratio);
967
968 // Verify complex structure roundtrip
969 let proj_json_parsed = Project::from_json(&proj_json)?;
970 let proj_binary_parsed = Project::from_binary(&proj_binary)?;
971
972 assert_eq!(complex_project, proj_json_parsed);
973 assert_eq!(complex_project, proj_binary_parsed);
974 println!("Complex structure roundtrip: VERIFIED");
975
976 Ok(())
977}
78 fn to_field_value(&self) -> FieldValue {
79 // Convert to JSON and then parse as FieldValue for nested object handling
80 match self.to_json() {
81 Ok(json_str) => {
82 // For examples, we'll serialize as JSON string for simplicity
83 FieldValue::from_json_object(json_str)
84 }
85 Err(_) => FieldValue::from_string("serialization_error".to_string()),
86 }
87 }
88}
89
90impl FromFieldValue for UserProfile {
91 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
92 // Try JSON object first
93 if let Ok(json_data) = value.as_json_object() {
94 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
95 field: field_name.to_string(),
96 message: format!("Failed to deserialize UserProfile from JSON: {}", e),
97 });
98 }
99
100 // Try binary object
101 if let Ok(binary_data) = value.as_binary_object() {
102 return Self::from_binary(binary_data).map_err(|e| {
103 SerializationError::ValidationFailed {
104 field: field_name.to_string(),
105 message: format!("Failed to deserialize UserProfile from binary: {}", e),
106 }
107 });
108 }
109
110 Err(SerializationError::ValidationFailed {
111 field: field_name.to_string(),
112 message: format!(
113 "Expected JsonObject or BinaryObject for UserProfile, found {}",
114 value.type_name()
115 ),
116 })
117 }
118}
119
120/// Application settings struct with optional fields and collections
121#[derive(Debug, Clone, PartialEq)]
122pub struct AppSettings {
123 pub app_name: String,
124 pub version: String,
125 pub debug_mode: bool,
126 pub max_connections: u32,
127 pub timeout_seconds: f32,
128 pub features: Vec<String>,
129 pub environment_vars: HashMap<String, String>,
130 pub optional_database_url: Option<String>,
131}
132
133impl StructSerializable for AppSettings {
134 fn to_serializer(&self) -> StructSerializer {
135 StructSerializer::new()
136 .field("app_name", &self.app_name)
137 .field("version", &self.version)
138 .field("debug_mode", &self.debug_mode)
139 .field("max_connections", &self.max_connections)
140 .field("timeout_seconds", &self.timeout_seconds)
141 .field("features", &self.features)
142 .field("environment_vars", &self.environment_vars)
143 .field("optional_database_url", &self.optional_database_url)
144 }
145
146 fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
147 let app_name = deserializer.field("app_name")?;
148 let version = deserializer.field("version")?;
149 let debug_mode = deserializer.field("debug_mode")?;
150 let max_connections = deserializer.field("max_connections")?;
151 let timeout_seconds = deserializer.field("timeout_seconds")?;
152 let features = deserializer.field("features")?;
153 let environment_vars = deserializer.field("environment_vars")?;
154 let optional_database_url = deserializer.field("optional_database_url")?;
155
156 Ok(AppSettings {
157 app_name,
158 version,
159 debug_mode,
160 max_connections,
161 timeout_seconds,
162 features,
163 environment_vars,
164 optional_database_url,
165 })
166 }
167}
168
169impl ToFieldValue for AppSettings {
170 fn to_field_value(&self) -> FieldValue {
171 // Convert to JSON and then parse as FieldValue for nested object handling
172 match self.to_json() {
173 Ok(json_str) => FieldValue::from_json_object(json_str),
174 Err(_) => FieldValue::from_string("serialization_error".to_string()),
175 }
176 }
177}
178
179impl FromFieldValue for AppSettings {
180 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
181 // Try JSON object first
182 if let Ok(json_data) = value.as_json_object() {
183 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
184 field: field_name.to_string(),
185 message: format!("Failed to deserialize AppSettings from JSON: {}", e),
186 });
187 }
188
189 // Try binary object
190 if let Ok(binary_data) = value.as_binary_object() {
191 return Self::from_binary(binary_data).map_err(|e| {
192 SerializationError::ValidationFailed {
193 field: field_name.to_string(),
194 message: format!("Failed to deserialize AppSettings from binary: {}", e),
195 }
196 });
197 }
198
199 Err(SerializationError::ValidationFailed {
200 field: field_name.to_string(),
201 message: format!(
202 "Expected JsonObject or BinaryObject for AppSettings, found {}",
203 value.type_name()
204 ),
205 })
206 }
207}
208
209fn main() -> Result<(), Box<dyn std::error::Error>> {
210 println!("=== Basic Struct Serialization Example ===\n");
211
212 demonstrate_user_profile_serialization()?;
213 demonstrate_app_settings_serialization()?;
214 demonstrate_format_comparison()?;
215 demonstrate_roundtrip_verification()?;
216 demonstrate_field_access_patterns()?;
217 cleanup_temp_files()?;
218
219 println!("\n=== Example completed successfully! ===");
220 Ok(())
221}
222
223/// Demonstrate basic struct serialization with simple field types
224fn demonstrate_user_profile_serialization() -> Result<(), Box<dyn std::error::Error>> {
225 println!("--- User Profile Serialization ---");
226
227 // Create a user profile with various field types
228 let user = UserProfile {
229 id: 12345,
230 username: "alice_cooper".to_string(),
231 email: "alice@example.com".to_string(),
232 age: 28,
233 is_active: true,
234 score: 95.7,
235 };
236
237 println!("Original user profile:");
238 println!(" ID: {}", user.id);
239 println!(" Username: {}", user.username);
240 println!(" Email: {}", user.email);
241 println!(" Age: {}", user.age);
242 println!(" Active: {}", user.is_active);
243 println!(" Score: {}", user.score);
244
245 // Serialize to JSON
246 let json_data = user.to_json()?;
247 println!("\nSerialized to JSON:");
248 println!("{}", json_data);
249
250 // Save to JSON file
251 user.save_json("temp_user_profile.json")?;
252 println!("Saved to file: temp_user_profile.json");
253
254 // Load from JSON file
255 let loaded_user = UserProfile::load_json("temp_user_profile.json")?;
256 println!("\nLoaded user profile:");
257 println!(" ID: {}", loaded_user.id);
258 println!(" Username: {}", loaded_user.username);
259 println!(" Email: {}", loaded_user.email);
260 println!(" Age: {}", loaded_user.age);
261 println!(" Active: {}", loaded_user.is_active);
262 println!(" Score: {}", loaded_user.score);
263
264 // Verify data integrity
265 assert_eq!(user, loaded_user);
266 println!("Data integrity verification: PASSED");
267
268 Ok(())
269}
270
271/// Demonstrate serialization with collections and optional fields
272fn demonstrate_app_settings_serialization() -> Result<(), Box<dyn std::error::Error>> {
273 println!("\n--- App Settings Serialization ---");
274
275 // Create app settings with collections and optional fields
276 let mut env_vars = HashMap::new();
277 env_vars.insert("LOG_LEVEL".to_string(), "info".to_string());
278 env_vars.insert("PORT".to_string(), "8080".to_string());
279 env_vars.insert("HOST".to_string(), "localhost".to_string());
280
281 let settings = AppSettings {
282 app_name: "Train Station Example".to_string(),
283 version: "1.0.0".to_string(),
284 debug_mode: true,
285 max_connections: 100,
286 timeout_seconds: 30.5,
287 features: vec![
288 "authentication".to_string(),
289 "logging".to_string(),
290 "metrics".to_string(),
291 ],
292 environment_vars: env_vars,
293 optional_database_url: Some("postgresql://localhost:5432/mydb".to_string()),
294 };
295
296 println!("Original app settings:");
297 println!(" App Name: {}", settings.app_name);
298 println!(" Version: {}", settings.version);
299 println!(" Debug Mode: {}", settings.debug_mode);
300 println!(" Max Connections: {}", settings.max_connections);
301 println!(" Timeout: {} seconds", settings.timeout_seconds);
302 println!(" Features: {:?}", settings.features);
303 println!(" Environment Variables: {:?}", settings.environment_vars);
304 println!(" Database URL: {:?}", settings.optional_database_url);
305
306 // Serialize to binary format for efficient storage
307 let binary_data = settings.to_binary()?;
308 println!("\nSerialized to binary: {} bytes", binary_data.len());
309
310 // Save to binary file
311 settings.save_binary("temp_app_settings.bin")?;
312 println!("Saved to file: temp_app_settings.bin");
313
314 // Load from binary file
315 let loaded_settings = AppSettings::load_binary("temp_app_settings.bin")?;
316 println!("\nLoaded app settings:");
317 println!(" App Name: {}", loaded_settings.app_name);
318 println!(" Version: {}", loaded_settings.version);
319 println!(" Debug Mode: {}", loaded_settings.debug_mode);
320 println!(" Features count: {}", loaded_settings.features.len());
321 println!(
322 " Environment variables count: {}",
323 loaded_settings.environment_vars.len()
324 );
325
326 // Verify data integrity
327 assert_eq!(settings, loaded_settings);
328 println!("Data integrity verification: PASSED");
329
330 Ok(())
331}
332
333/// Demonstrate format comparison between JSON and binary
334fn demonstrate_format_comparison() -> Result<(), Box<dyn std::error::Error>> {
335 println!("\n--- Format Comparison ---");
336
337 let user = UserProfile {
338 id: 98765,
339 username: "bob_builder".to_string(),
340 email: "bob@construction.com".to_string(),
341 age: 35,
342 is_active: false,
343 score: 87.2,
344 };
345
346 // Save in both formats
347 user.save_json("temp_format_comparison.json")?;
348 user.save_binary("temp_format_comparison.bin")?;
349
350 // Compare file sizes
351 let json_size = fs::metadata("temp_format_comparison.json")?.len();
352 let binary_size = fs::metadata("temp_format_comparison.bin")?.len();
353
354 println!("Format comparison for UserProfile:");
355 println!(" JSON file size: {} bytes", json_size);
356 println!(" Binary file size: {} bytes", binary_size);
357 println!(
358 " Size ratio (JSON/Binary): {:.2}x",
359 json_size as f64 / binary_size as f64
360 );
361
362 // Demonstrate readability
363 let json_content = fs::read_to_string("temp_format_comparison.json")?;
364 println!("\nJSON format (human-readable):");
365 println!("{}", json_content);
366
367 println!("\nBinary format (first 32 bytes as hex):");
368 let binary_content = fs::read("temp_format_comparison.bin")?;
369 for (i, byte) in binary_content.iter().take(32).enumerate() {
370 if i % 16 == 0 && i > 0 {
371 println!();
372 }
373 print!("{:02x} ", byte);
374 }
375 println!("\n... ({} total bytes)", binary_content.len());
376
377 // Load and verify both formats produce identical results
378 let json_loaded = UserProfile::load_json("temp_format_comparison.json")?;
379 let binary_loaded = UserProfile::load_binary("temp_format_comparison.bin")?;
380
381 assert_eq!(json_loaded, binary_loaded);
382 println!("\nFormat consistency verification: PASSED");
383
384 Ok(())
385}
386
387/// Demonstrate roundtrip verification with multiple data variations
388fn demonstrate_roundtrip_verification() -> Result<(), Box<dyn std::error::Error>> {
389 println!("\n--- Roundtrip Verification ---");
390
391 // Test various data patterns
392 let test_users = [
393 UserProfile {
394 id: 0,
395 username: "".to_string(),
396 email: "empty@test.com".to_string(),
397 age: 0,
398 is_active: false,
399 score: 0.0,
400 },
401 UserProfile {
402 id: u32::MAX,
403 username: "maximal_user_with_very_long_name_123456789".to_string(),
404 email: "test@verylongdomainname.example.org".to_string(),
405 age: i32::MAX,
406 is_active: true,
407 score: 999999.5,
408 },
409 UserProfile {
410 id: 42,
411 username: "unicode_tëst_🦀".to_string(),
412 email: "unicode@tëst.com".to_string(),
413 age: 25,
414 is_active: true,
415 score: -123.456,
416 },
417 ];
418
419 println!(
420 "Testing roundtrip serialization with {} variations:",
421 test_users.len()
422 );
423
424 for (i, user) in test_users.iter().enumerate() {
425 println!(
426 " Test case {}: ID={}, Username='{}'",
427 i + 1,
428 user.id,
429 user.username
430 );
431
432 // JSON roundtrip
433 let json_data = user.to_json()?;
434 let json_parsed = UserProfile::from_json(&json_data)?;
435 assert_eq!(*user, json_parsed);
436
437 // Binary roundtrip
438 let binary_data = user.to_binary()?;
439 let binary_parsed = UserProfile::from_binary(&binary_data)?;
440 assert_eq!(*user, binary_parsed);
441
442 println!(" JSON roundtrip: PASSED");
443 println!(" Binary roundtrip: PASSED");
444 }
445
446 println!("All roundtrip tests: PASSED");
447
448 Ok(())
449}
450
451/// Demonstrate field access patterns and validation
452fn demonstrate_field_access_patterns() -> Result<(), Box<dyn std::error::Error>> {
453 println!("\n--- Field Access Patterns ---");
454
455 let settings = AppSettings {
456 app_name: "Field Test App".to_string(),
457 version: "2.1.0".to_string(),
458 debug_mode: false,
459 max_connections: 50,
460 timeout_seconds: 15.0,
461 features: vec!["basic".to_string(), "advanced".to_string()],
462 environment_vars: HashMap::new(),
463 optional_database_url: None,
464 };
465
466 // Convert to JSON to inspect structure
467 let json_data = settings.to_json()?;
468 println!("JSON structure for field inspection:");
469
470 // Count approximate fields by counting field separators
471 let field_count = json_data.matches(':').count();
472 println!("Estimated fields: {}", field_count);
473
474 // Show structure (first few lines)
475 let lines: Vec<&str> = json_data.lines().take(5).collect();
476 for line in lines {
477 println!(" {}", line.trim());
478 }
479 if json_data.lines().count() > 5 {
480 println!(" ... ({} more lines)", json_data.lines().count() - 5);
481 }
482
483 // Demonstrate optional field handling
484 println!("\nOptional field handling:");
485 println!(
486 " Database URL is None: {}",
487 settings.optional_database_url.is_none()
488 );
489
490 // Create version with optional field populated
491 let settings_with_db = AppSettings {
492 optional_database_url: Some("sqlite:///tmp/test.db".to_string()),
493 ..settings.clone()
494 };
495
496 println!(
497 " Database URL with value: {:?}",
498 settings_with_db.optional_database_url
499 );
500
501 // Verify both versions serialize/deserialize correctly
502 let json_none = settings.to_json()?;
503 let json_some = settings_with_db.to_json()?;
504
505 let parsed_none = AppSettings::from_json(&json_none)?;
506 let parsed_some = AppSettings::from_json(&json_some)?;
507
508 assert_eq!(settings, parsed_none);
509 assert_eq!(settings_with_db, parsed_some);
510 assert!(parsed_none.optional_database_url.is_none());
511 assert!(parsed_some.optional_database_url.is_some());
512
513 println!("Optional field serialization: PASSED");
514
515 Ok(())
516}
fn to_binary(&self) -> SerializationResult<Vec<u8>>
Converts the struct to binary data
Serializes the struct to a compact binary format optimized for efficient storage and transmission.
§Returns
Ok(Vec<u8>) containing the binary representation on success
Err(SerializationError) if serialization fails
§Examples
Serializes a struct to compact binary data and deserializes it back without loss.
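A minimal usage sketch of the binary roundtrip (the `Config` struct and its fields are illustrative; any type implementing `StructSerializable` works the same way):

```rust
// `Config` is a hypothetical type assumed to implement StructSerializable.
let config = Config { retries: 3, verbose: true };

// Serialize to the compact binary format.
let bytes: Vec<u8> = config.to_binary()?;

// Roundtrip: deserializing the bytes should reproduce the original value.
let restored = Config::from_binary(&bytes)?;
assert_eq!(config, restored);
```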
Examples found in repository:
719fn demonstrate_deep_serialization() -> Result<(), Box<dyn std::error::Error>> {
720 println!("\n--- Deep Serialization Analysis ---");
721
722 let deep_milestone = Milestone {
723 name: "Deep Milestone".to_string(),
724 description: "Testing deep nesting serialization".to_string(),
725 due_date: "2024-12-31".to_string(),
726 is_completed: false,
727 progress_percentage: 50.0,
728 dependencies: vec!["Parent Task".to_string(), "Sibling Task".to_string()],
729 };
730
731 let deep_project = Project {
732 name: "Deep Nesting Test".to_string(),
733 description: "Project for testing serialization depth".to_string(),
734 status: ProjectStatus::Planning,
735 budget: 100000.0,
736 team_members: vec!["Developer 1".to_string(), "Developer 2".to_string()],
737 milestones: vec![deep_milestone],
738 metadata: HashMap::new(),
739 };
740
741 // Analyze serialization output
742 let json_output = deep_project.to_json()?;
743 let binary_output = deep_project.to_binary()?;
744
745 println!("Deep structure serialization analysis:");
746 println!(" JSON size: {} bytes", json_output.len());
747 println!(" Binary size: {} bytes", binary_output.len());
748     println!("  Nesting levels: Project -> Milestone -> Dependencies");
749
750 // Count nested objects in JSON (rough estimate)
751 let object_count = json_output.matches('{').count();
752 let array_count = json_output.matches('[').count();
753 println!(" JSON objects: {}", object_count);
754 println!(" JSON arrays: {}", array_count);
755
756 // Verify deep roundtrip
757 let json_parsed = Project::from_json(&json_output)?;
758 let binary_parsed = Project::from_binary(&binary_output)?;
759
760 assert_eq!(deep_project, json_parsed);
761 assert_eq!(deep_project, binary_parsed);
762 println!("Deep serialization roundtrip: VERIFIED");
763
764 Ok(())
765}
766
767/// Demonstrate collection nesting patterns
768fn demonstrate_collection_nesting() -> Result<(), Box<dyn std::error::Error>> {
769 println!("\n--- Collection Nesting Patterns ---");
770
771 // Create multiple departments with varying complexity
772 let departments = vec![
773 Department {
774 name: "Research".to_string(),
775 manager: "Dr. Science".to_string(),
776 employee_count: 25,
777 budget: 1200000.0,
778 office_locations: vec![
779 Address {
780 street: "1 Research Blvd".to_string(),
781 city: "Innovation Hub".to_string(),
782 state: "MA".to_string(),
783 postal_code: "02101".to_string(),
784 country: "USA".to_string(),
785 },
786 Address {
787 street: "2 Lab Street".to_string(),
788 city: "Tech Valley".to_string(),
789 state: "NY".to_string(),
790 postal_code: "12180".to_string(),
791 country: "USA".to_string(),
792 },
793 ],
794 },
795 Department {
796 name: "Quality Assurance".to_string(),
797 manager: "Test Master".to_string(),
798 employee_count: 12,
799 budget: 600000.0,
800 office_locations: vec![], // Empty collection
801 },
802 ];
803
804 println!("Collection nesting analysis:");
805 println!(" Departments: {}", departments.len());
806
807 let total_locations: usize = departments.iter().map(|d| d.office_locations.len()).sum();
808 println!(" Total office locations: {}", total_locations);
809
810 // Test serialization with mixed empty and populated collections
811 // Note: Vec<Department> doesn't implement StructSerializable directly.
812 // For this example, we'll serialize each department individually
813 let department_json_strings: Result<Vec<String>, _> =
814 departments.iter().map(|dept| dept.to_json()).collect();
815 let department_json_strings = department_json_strings?;
816
817 // Deserialize each department back
818 let parsed_departments: Result<Vec<Department>, _> = department_json_strings
819 .iter()
820 .map(|json_str| Department::from_json(json_str))
821 .collect();
822 let parsed_departments = parsed_departments?;
823
824 assert_eq!(departments, parsed_departments);
825 println!("Collection nesting serialization: VERIFIED");
826
827 // Analyze collection patterns
828 for (i, dept) in departments.iter().enumerate() {
829 println!(
830 " Department {}: {} locations",
831 i + 1,
832 dept.office_locations.len()
833 );
834 }
835
836 Ok(())
837}
838
839/// Demonstrate partial loading and field access
840fn demonstrate_partial_loading() -> Result<(), Box<dyn std::error::Error>> {
841 println!("\n--- Partial Loading and Field Access ---");
842
843 // Create a simple project for analysis
844 let project = Project {
845 name: "Sample Project".to_string(),
846 description: "For testing partial loading".to_string(),
847 status: ProjectStatus::InProgress,
848 budget: 50000.0,
849 team_members: vec!["Alice".to_string(), "Bob".to_string()],
850 milestones: vec![Milestone {
851 name: "Phase 1".to_string(),
852 description: "Initial phase".to_string(),
853 due_date: "2024-06-01".to_string(),
854 is_completed: true,
855 progress_percentage: 100.0,
856 dependencies: vec![],
857 }],
858 metadata: HashMap::new(),
859 };
860
861 // Convert to JSON and analyze structure
862 println!("Project JSON structure analysis:");
863
864 // Parse to examine available fields by inspecting JSON structure
865 let json_data = project.to_json()?;
866 let field_count = json_data.matches(':').count();
867 println!(" Estimated fields: {}", field_count);
868
869 // Show top-level structure
870 let lines: Vec<&str> = json_data.lines().take(10).collect();
871 println!(" JSON structure preview:");
872 for line in lines.iter().take(5) {
873 if let Some(colon_pos) = line.find(':') {
874 let field_name = line[..colon_pos].trim().trim_matches('"').trim();
875 if !field_name.is_empty() {
876 println!(" - {}", field_name);
877 }
878 }
879 }
880
881 // Demonstrate field type analysis
882 println!("\nField type analysis:");
883 println!(" name: String");
884 println!(" status: Enum -> String");
885 println!(" budget: f64 -> Number");
886 println!(" team_members: Vec<String> -> Array");
887 println!(" milestones: Vec<Milestone> -> Array of Objects");
888
889 Ok(())
890}
891
892/// Demonstrate performance analysis for nested structures
893fn demonstrate_performance_analysis() -> Result<(), Box<dyn std::error::Error>> {
894 println!("\n--- Performance Analysis ---");
895
896 // Create structures of varying complexity
897 let simple_address = Address {
898 street: "123 Main St".to_string(),
899 city: "Anytown".to_string(),
900 state: "ST".to_string(),
901 postal_code: "12345".to_string(),
902 country: "USA".to_string(),
903 };
904
905 let complex_department = Department {
906 name: "Complex Department".to_string(),
907 manager: "Manager Name".to_string(),
908 employee_count: 100,
909 budget: 5000000.0,
910 office_locations: vec![simple_address.clone(); 10], // 10 identical addresses
911 };
912
913 let complex_project = Project {
914 name: "Complex Project".to_string(),
915 description: "Large project with many components".to_string(),
916 status: ProjectStatus::InProgress,
917 budget: 2000000.0,
918 team_members: (1..=50).map(|i| format!("Team Member {}", i)).collect(),
919 milestones: (1..=20)
920 .map(|i| Milestone {
921 name: format!("Milestone {}", i),
922 description: format!("Description for milestone {}", i),
923 due_date: "2024-12-31".to_string(),
924 is_completed: i <= 10,
925 progress_percentage: if i <= 10 { 100.0 } else { 50.0 },
926 dependencies: if i > 1 {
927 vec![format!("Milestone {}", i - 1)]
928 } else {
929 vec![]
930 },
931 })
932 .collect(),
933 metadata: HashMap::new(),
934 };
935
936 // Measure serialization performance
937 println!("Performance comparison:");
938
939 // Simple address
940 let addr_json = simple_address.to_json()?;
941 let addr_binary = simple_address.to_binary()?;
942 println!(" Simple Address:");
943 println!(" JSON: {} bytes", addr_json.len());
944 println!(" Binary: {} bytes", addr_binary.len());
945
946 // Complex department
947 let dept_json = complex_department.to_json()?;
948 let dept_binary = complex_department.to_binary()?;
949 println!(" Complex Department (10 addresses):");
950 println!(" JSON: {} bytes", dept_json.len());
951 println!(" Binary: {} bytes", dept_binary.len());
952
953 // Complex project
954 let proj_json = complex_project.to_json()?;
955 let proj_binary = complex_project.to_binary()?;
956 println!(" Complex Project (50 members, 20 milestones):");
957 println!(" JSON: {} bytes", proj_json.len());
958 println!(" Binary: {} bytes", proj_binary.len());
959
960 // Calculate efficiency ratios
961 let dept_ratio = dept_json.len() as f64 / dept_binary.len() as f64;
962 let proj_ratio = proj_json.len() as f64 / proj_binary.len() as f64;
963
964 println!("\nFormat efficiency (JSON/Binary ratio):");
965 println!(" Department: {:.2}x", dept_ratio);
966 println!(" Project: {:.2}x", proj_ratio);
967
968 // Verify complex structure roundtrip
969 let proj_json_parsed = Project::from_json(&proj_json)?;
970 let proj_binary_parsed = Project::from_binary(&proj_binary)?;
971
972 assert_eq!(complex_project, proj_json_parsed);
973 assert_eq!(complex_project, proj_binary_parsed);
974 println!("Complex structure roundtrip: VERIFIED");
975
976 Ok(())
977}
More examples
272fn demonstrate_app_settings_serialization() -> Result<(), Box<dyn std::error::Error>> {
273 println!("\n--- App Settings Serialization ---");
274
275 // Create app settings with collections and optional fields
276 let mut env_vars = HashMap::new();
277 env_vars.insert("LOG_LEVEL".to_string(), "info".to_string());
278 env_vars.insert("PORT".to_string(), "8080".to_string());
279 env_vars.insert("HOST".to_string(), "localhost".to_string());
280
281 let settings = AppSettings {
282 app_name: "Train Station Example".to_string(),
283 version: "1.0.0".to_string(),
284 debug_mode: true,
285 max_connections: 100,
286 timeout_seconds: 30.5,
287 features: vec![
288 "authentication".to_string(),
289 "logging".to_string(),
290 "metrics".to_string(),
291 ],
292 environment_vars: env_vars,
293 optional_database_url: Some("postgresql://localhost:5432/mydb".to_string()),
294 };
295
296 println!("Original app settings:");
297 println!(" App Name: {}", settings.app_name);
298 println!(" Version: {}", settings.version);
299 println!(" Debug Mode: {}", settings.debug_mode);
300 println!(" Max Connections: {}", settings.max_connections);
301 println!(" Timeout: {} seconds", settings.timeout_seconds);
302 println!(" Features: {:?}", settings.features);
303 println!(" Environment Variables: {:?}", settings.environment_vars);
304 println!(" Database URL: {:?}", settings.optional_database_url);
305
306 // Serialize to binary format for efficient storage
307 let binary_data = settings.to_binary()?;
308 println!("\nSerialized to binary: {} bytes", binary_data.len());
309
310 // Save to binary file
311 settings.save_binary("temp_app_settings.bin")?;
312 println!("Saved to file: temp_app_settings.bin");
313
314 // Load from binary file
315 let loaded_settings = AppSettings::load_binary("temp_app_settings.bin")?;
316 println!("\nLoaded app settings:");
317 println!(" App Name: {}", loaded_settings.app_name);
318 println!(" Version: {}", loaded_settings.version);
319 println!(" Debug Mode: {}", loaded_settings.debug_mode);
320 println!(" Features count: {}", loaded_settings.features.len());
321 println!(
322 " Environment variables count: {}",
323 loaded_settings.environment_vars.len()
324 );
325
326 // Verify data integrity
327 assert_eq!(settings, loaded_settings);
328 println!("Data integrity verification: PASSED");
329
330 Ok(())
331}
332
333/// Demonstrate format comparison between JSON and binary
334fn demonstrate_format_comparison() -> Result<(), Box<dyn std::error::Error>> {
335 println!("\n--- Format Comparison ---");
336
337 let user = UserProfile {
338 id: 98765,
339 username: "bob_builder".to_string(),
340 email: "bob@construction.com".to_string(),
341 age: 35,
342 is_active: false,
343 score: 87.2,
344 };
345
346 // Save in both formats
347 user.save_json("temp_format_comparison.json")?;
348 user.save_binary("temp_format_comparison.bin")?;
349
350 // Compare file sizes
351 let json_size = fs::metadata("temp_format_comparison.json")?.len();
352 let binary_size = fs::metadata("temp_format_comparison.bin")?.len();
353
354 println!("Format comparison for UserProfile:");
355 println!(" JSON file size: {} bytes", json_size);
356 println!(" Binary file size: {} bytes", binary_size);
357 println!(
358 " Size ratio (JSON/Binary): {:.2}x",
359 json_size as f64 / binary_size as f64
360 );
361
362 // Demonstrate readability
363 let json_content = fs::read_to_string("temp_format_comparison.json")?;
364 println!("\nJSON format (human-readable):");
365 println!("{}", json_content);
366
367 println!("\nBinary format (first 32 bytes as hex):");
368 let binary_content = fs::read("temp_format_comparison.bin")?;
369 for (i, byte) in binary_content.iter().take(32).enumerate() {
370 if i % 16 == 0 && i > 0 {
371 println!();
372 }
373 print!("{:02x} ", byte);
374 }
375 println!("\n... ({} total bytes)", binary_content.len());
376
377 // Load and verify both formats produce identical results
378 let json_loaded = UserProfile::load_json("temp_format_comparison.json")?;
379 let binary_loaded = UserProfile::load_binary("temp_format_comparison.bin")?;
380
381 assert_eq!(json_loaded, binary_loaded);
382 println!("\nFormat consistency verification: PASSED");
383
384 Ok(())
385}
386
387/// Demonstrate roundtrip verification with multiple data variations
388fn demonstrate_roundtrip_verification() -> Result<(), Box<dyn std::error::Error>> {
389 println!("\n--- Roundtrip Verification ---");
390
391 // Test various data patterns
392 let test_users = [
393 UserProfile {
394 id: 0,
395 username: "".to_string(),
396 email: "empty@test.com".to_string(),
397 age: 0,
398 is_active: false,
399 score: 0.0,
400 },
401 UserProfile {
402 id: u32::MAX,
403 username: "maximal_user_with_very_long_name_123456789".to_string(),
404 email: "test@verylongdomainname.example.org".to_string(),
405 age: i32::MAX,
406 is_active: true,
407 score: 999999.5,
408 },
409 UserProfile {
410 id: 42,
411 username: "unicode_tëst_🦀".to_string(),
412 email: "unicode@tëst.com".to_string(),
413 age: 25,
414 is_active: true,
415 score: -123.456,
416 },
417 ];
418
419 println!(
420 "Testing roundtrip serialization with {} variations:",
421 test_users.len()
422 );
423
424 for (i, user) in test_users.iter().enumerate() {
425 println!(
426 " Test case {}: ID={}, Username='{}'",
427 i + 1,
428 user.id,
429 user.username
430 );
431
432 // JSON roundtrip
433 let json_data = user.to_json()?;
434 let json_parsed = UserProfile::from_json(&json_data)?;
435 assert_eq!(*user, json_parsed);
436
437 // Binary roundtrip
438 let binary_data = user.to_binary()?;
439 let binary_parsed = UserProfile::from_binary(&binary_data)?;
440 assert_eq!(*user, binary_parsed);
441
442 println!(" JSON roundtrip: PASSED");
443 println!(" Binary roundtrip: PASSED");
444 }
445
446 println!("All roundtrip tests: PASSED");
447
448 Ok(())
449}
More examples
342fn demonstrate_format_characteristics() -> Result<(), Box<dyn std::error::Error>> {
343 println!("--- Format Characteristics ---");
344
345 // Create sample data structures
346 let mut metadata = HashMap::new();
347 metadata.insert("operation_type".to_string(), "benchmark".to_string());
348 metadata.insert("system".to_string(), "train_station".to_string());
349
350 let metrics = PerformanceMetrics {
351 operation: "tensor_multiplication".to_string(),
352 duration_micros: 1234,
353 memory_usage_bytes: 8192,
354 cpu_usage_percent: 75.5,
355 throughput_ops_per_sec: 1000.0,
356 metadata,
357 };
358
359 println!("Format characteristics analysis:");
360
361 // JSON characteristics
362 let json_data = metrics.to_json()?;
363 let json_lines = json_data.lines().count();
364 let json_chars = json_data.chars().count();
365
366 println!("\nJSON Format:");
367 println!(" Size: {} bytes", json_data.len());
368 println!(" Characters: {}", json_chars);
369 println!(" Lines: {}", json_lines);
370 println!(" Human readable: Yes");
371 println!(" Self-describing: Yes");
372 println!(" Cross-platform: Yes");
373 println!(" Compression ratio: Variable (depends on content)");
374
375 // Show sample JSON output
376 println!(" Sample output:");
377 for line in json_data.lines().take(3) {
378 println!(" {}", line);
379 }
380 if json_lines > 3 {
381 println!(" ... ({} more lines)", json_lines - 3);
382 }
383
384 // Binary characteristics
385 let binary_data = metrics.to_binary()?;
386
387 println!("\nBinary Format:");
388 println!(" Size: {} bytes", binary_data.len());
389 println!(" Human readable: No");
390 println!(" Self-describing: No (requires schema)");
391 println!(" Cross-platform: Yes (with proper endianness handling)");
392 println!(" Compression ratio: High (efficient encoding)");
393
394 // Show sample binary output (hex)
395 println!(" Sample output (first 32 bytes as hex):");
396 print!(" ");
397 for (i, byte) in binary_data.iter().take(32).enumerate() {
398 if i > 0 && i % 16 == 0 {
399 println!();
400 print!(" ");
401 }
402 print!("{:02x} ", byte);
403 }
404 if binary_data.len() > 32 {
405 println!("\n ... ({} more bytes)", binary_data.len() - 32);
406 } else {
407 println!();
408 }
409
410 // Verify roundtrip for both formats
411 let json_parsed = PerformanceMetrics::from_json(&json_data)?;
412 let binary_parsed = PerformanceMetrics::from_binary(&binary_data)?;
413
414 assert_eq!(metrics, json_parsed);
415 assert_eq!(metrics, binary_parsed);
416 println!("\nRoundtrip verification: PASSED");
417
418 Ok(())
419}
420
421/// Demonstrate size comparisons across different data types
422fn demonstrate_size_comparisons() -> Result<(), Box<dyn std::error::Error>> {
423 println!("\n--- Size Comparison Analysis ---");
424
425 // Test 1: Small configuration data (typical JSON use case)
426 let mut db_settings = HashMap::new();
427 db_settings.insert("host".to_string(), "localhost".to_string());
428 db_settings.insert("port".to_string(), "5432".to_string());
429 db_settings.insert("database".to_string(), "myapp".to_string());
430
431 let config = Configuration {
432 version: "1.2.3".to_string(),
433 debug_enabled: true,
434 log_level: "info".to_string(),
435 database_settings: db_settings,
436 feature_flags_enabled: true,
437 max_connections: 100.0,
438 timeout_seconds: 30.0,
439 };
440
441 // Test 2: Large numeric dataset (typical binary use case)
442 let large_dataset = LargeDataset {
443 name: "ML Training Data".to_string(),
444 values: (0..1000).map(|i| i as f32 * 0.1).collect(),
445 labels: (0..1000).map(|i| format!("label_{}", i)).collect(),
446 feature_count: 100,
447 feature_dimension: 50,
448 timestamp_count: 1000,
449 metadata: HashMap::new(),
450 };
451
452 println!("Size comparison results:");
453
454 // Configuration comparison
455 let config_json = config.to_json()?;
456 let config_binary = config.to_binary()?;
457
458 println!("\nConfiguration Data (small, text-heavy):");
459 println!(" JSON: {} bytes", config_json.len());
460 println!(" Binary: {} bytes", config_binary.len());
461 println!(
462 " Ratio (JSON/Binary): {:.2}x",
463 config_json.len() as f64 / config_binary.len() as f64
464 );
465 println!(" Recommendation: JSON (human readable, small size difference)");
466
467 // Large dataset comparison
468 let dataset_json = large_dataset.to_json()?;
469 let dataset_binary = large_dataset.to_binary()?;
470
471 println!("\nLarge Numeric Dataset (1000 values, 100x50 matrix):");
472 println!(
473 " JSON: {} bytes ({:.1} KB)",
474 dataset_json.len(),
475 dataset_json.len() as f64 / 1024.0
476 );
477 println!(
478 " Binary: {} bytes ({:.1} KB)",
479 dataset_binary.len(),
480 dataset_binary.len() as f64 / 1024.0
481 );
482 println!(
483 " Ratio (JSON/Binary): {:.2}x",
484 dataset_json.len() as f64 / dataset_binary.len() as f64
485 );
486 if dataset_json.len() > dataset_binary.len() {
487 println!(
488 " Space saved with binary: {} bytes ({:.1} KB)",
489 dataset_json.len() - dataset_binary.len(),
490 (dataset_json.len() - dataset_binary.len()) as f64 / 1024.0
491 );
492 println!(" Recommendation: Binary (significant size reduction)");
493 } else {
494 println!(
495 " Binary overhead: {} bytes ({:.1} KB)",
496 dataset_binary.len() - dataset_json.len(),
497 (dataset_binary.len() - dataset_json.len()) as f64 / 1024.0
498 );
499 println!(" Recommendation: JSON (binary overhead not justified for this size)");
500 }
501
502 // Content analysis
503 println!("\nContent Type Analysis:");
504
505 // Analyze JSON content patterns
506 let json_numbers = dataset_json.matches(char::is_numeric).count();
507 let json_brackets = dataset_json.matches('[').count() + dataset_json.matches(']').count();
508 let json_quotes = dataset_json.matches('"').count();
509
510 println!(" JSON overhead sources:");
511 println!(" Numeric characters: ~{}", json_numbers);
512    println!("    Brackets: ~{}", json_brackets);
513 println!(" Quote marks: {}", json_quotes);
514 println!(" Formatting/whitespace: Varies");
515
516 println!(" Binary advantages:");
517 println!(" Direct numeric encoding: 4-8 bytes per number");
518 println!(" No formatting overhead: Zero bytes");
519 println!(" Efficient length encoding: Minimal bytes");
520
521 Ok(())
522}
523
524/// Demonstrate performance benchmarks
525fn demonstrate_performance_benchmarks() -> Result<(), Box<dyn std::error::Error>> {
526 println!("\n--- Performance Benchmark Analysis ---");
527
528 // Create test data of varying sizes
529 let small_config = Configuration {
530 version: "1.0.0".to_string(),
531 debug_enabled: false,
532 log_level: "warn".to_string(),
533 database_settings: HashMap::new(),
534 feature_flags_enabled: false,
535 max_connections: 100.0,
536 timeout_seconds: 30.0,
537 };
538
539 let large_dataset = LargeDataset {
540 name: "Large Dataset".to_string(),
541 values: (0..5000).map(|i| i as f32 * 0.001).collect(),
542 labels: (0..5000).map(|i| format!("large_item_{}", i)).collect(),
543 feature_count: 200,
544 feature_dimension: 25,
545 timestamp_count: 5000,
546 metadata: HashMap::new(),
547 };
548
549 println!("Performance benchmark results:");
550
551 // Benchmark each dataset (avoiding trait objects due to object safety)
552 let dataset_names = ["Small Config", "Large Dataset"];
553
554 for (i, name) in dataset_names.iter().enumerate() {
555 let mut comparison = FormatComparison::new(name.to_string());
556
557 // JSON serialization benchmark
558 let start = Instant::now();
559 let json_data = match i {
560 0 => small_config.to_json()?,
561 _ => large_dataset.to_json()?,
562 };
563 comparison.json_serialize_micros = start.elapsed().as_micros() as u64;
564 comparison.json_size_bytes = json_data.len() as u64;
565
566        // JSON deserialization benchmark
567 if *name == "Small Config" {
568 let start = Instant::now();
569 let _parsed = Configuration::from_json(&json_data)?;
570 comparison.json_deserialize_micros = start.elapsed().as_micros() as u64;
571 } else {
572 let start = Instant::now();
573 let _parsed = LargeDataset::from_json(&json_data)?;
574 comparison.json_deserialize_micros = start.elapsed().as_micros() as u64;
575 }
576
577 // Binary serialization benchmark
578 let start = Instant::now();
579 let binary_data = match i {
580 0 => small_config.to_binary()?,
581 _ => large_dataset.to_binary()?,
582 };
583 comparison.binary_serialize_micros = start.elapsed().as_micros() as u64;
584 comparison.binary_size_bytes = binary_data.len() as u64;
585
586 // Binary deserialization benchmark
587 if *name == "Small Config" {
588 let start = Instant::now();
589 let _parsed = Configuration::from_binary(&binary_data)?;
590 comparison.binary_deserialize_micros = start.elapsed().as_micros() as u64;
591 } else {
592 let start = Instant::now();
593 let _parsed = LargeDataset::from_binary(&binary_data)?;
594 comparison.binary_deserialize_micros = start.elapsed().as_micros() as u64;
595 }
596
597 // Calculate ratios
598 comparison.calculate_ratios();
599
600 // Display results
601 println!("\n{}:", name);
602 println!(
603 " Size - JSON: {} bytes, Binary: {} bytes (ratio: {:.2}x)",
604 comparison.json_size_bytes, comparison.binary_size_bytes, comparison.size_ratio
605 );
606 println!(
607 " Serialize - JSON: {}μs, Binary: {}μs (binary relative speed: {:.2}x)",
608 comparison.json_serialize_micros,
609 comparison.binary_serialize_micros,
610 comparison.serialize_speed_ratio
611 );
612 println!(
613 " Deserialize - JSON: {}μs, Binary: {}μs (binary relative speed: {:.2}x)",
614 comparison.json_deserialize_micros,
615 comparison.binary_deserialize_micros,
616 comparison.deserialize_speed_ratio
617 );
618 }
619
620 println!("\nPerformance Summary:");
621 println!(" - Binary format consistently uses less storage space");
622 println!(" - Performance differences vary by data type and size");
623 println!(" - Larger datasets show more significant binary advantages");
624 println!(" - JSON parsing overhead increases with structure complexity");
625
626 Ok(())
627}
628
629/// Demonstrate use case recommendations
630fn demonstrate_use_case_recommendations() -> Result<(), Box<dyn std::error::Error>> {
631 println!("\n--- Use Case Recommendations ---");
632
633 println!("JSON Format - Recommended for:");
634 println!(" ✓ Configuration files (human-editable)");
635 println!(" ✓ API responses (web compatibility)");
636 println!(" ✓ Debugging and development (readability)");
637 println!(" ✓ Small data structures (minimal overhead)");
638 println!(" ✓ Cross-language interoperability");
639 println!(" ✓ Schema evolution (self-describing)");
640 println!(" ✓ Text-heavy data with few numbers");
641
642 println!("\nBinary Format - Recommended for:");
643 println!(" ✓ Large datasets (memory/storage efficiency)");
644 println!(" ✓ High-performance applications (speed critical)");
645 println!(" ✓ Numeric-heavy data (ML models, matrices)");
646 println!(" ✓ Network transmission (bandwidth limited)");
647 println!(" ✓ Embedded systems (resource constrained)");
648 println!(" ✓ Long-term storage (space efficiency)");
649 println!(" ✓ Frequent serialization/deserialization");
650
651 // Demonstrate decision matrix
652 println!("\nDecision Matrix Example:");
653
654 let scenarios = vec![
655 (
656 "Web API Configuration",
657 "JSON",
658 "Human readable, web standard, small size",
659 ),
660 (
661 "ML Model Weights",
662 "Binary",
663 "Large numeric data, performance critical",
664 ),
665 (
666 "User Preferences",
667 "JSON",
668 "Human editable, self-documenting",
669 ),
670 (
671 "Real-time Telemetry",
672 "Binary",
673 "High frequency, bandwidth limited",
674 ),
675 (
676 "Application Settings",
677 "JSON",
678 "Developer accessible, version control friendly",
679 ),
680 (
681 "Scientific Dataset",
682 "Binary",
683 "Large arrays, storage efficiency critical",
684 ),
685 ];
686
687 for (scenario, recommendation, reason) in scenarios {
688 println!(" {} -> {} ({})", scenario, recommendation, reason);
689 }
690
691 // Create examples for common scenarios
692 println!("\nPractical Examples:");
693
694 // Configuration file example (JSON)
695 let config = Configuration {
696 version: "2.1.0".to_string(),
697 debug_enabled: false,
698 log_level: "info".to_string(),
699 database_settings: {
700 let mut map = HashMap::new();
701 map.insert("url".to_string(), "postgresql://localhost/app".to_string());
702 map.insert("pool_size".to_string(), "10".to_string());
703 map
704 },
705 feature_flags_enabled: true,
706 max_connections: 100.0,
707 timeout_seconds: 30.0,
708 };
709
710 config.save_json("temp_config_example.json")?;
711 let config_content = fs::read_to_string("temp_config_example.json")?;
712
713 println!("\nConfiguration File (JSON) - Human readable:");
714 for line in config_content.lines().take(5) {
715 println!(" {}", line);
716 }
717 println!(" ... (easily editable by developers)");
718
719 // Data export example (Binary)
720 let export_data = LargeDataset {
721 name: "Training Export".to_string(),
722 values: (0..1000).map(|i| (i as f32).sin()).collect(),
723 labels: (0..1000).map(|i| format!("sample_{:04}", i)).collect(),
724 feature_count: 50,
725 feature_dimension: 20,
726 timestamp_count: 1000,
727 metadata: HashMap::new(),
728 };
729
730 export_data.save_binary("temp_export_example.bin")?;
731 let export_size = fs::metadata("temp_export_example.bin")?.len();
732
733 println!("\nData Export (Binary) - Efficient storage:");
734 println!(
735 " File size: {} bytes ({:.1} KB)",
736 export_size,
737 export_size as f64 / 1024.0
738 );
739 println!(" 1000 numeric values + 50x20 matrix + metadata");
740 println!(" Compact encoding saves significant space vs JSON");
741
742 Ok(())
743}
744
745/// Demonstrate debugging capabilities
746fn demonstrate_debugging_capabilities() -> Result<(), Box<dyn std::error::Error>> {
747 println!("\n--- Debugging Capabilities ---");
748
749 let mut metadata = HashMap::new();
750 metadata.insert("debug_session".to_string(), "session_123".to_string());
751 metadata.insert("error_code".to_string(), "E001".to_string());
752
753 let debug_metrics = PerformanceMetrics {
754 operation: "debug_test".to_string(),
755 duration_micros: 5432,
756 memory_usage_bytes: 16384,
757 cpu_usage_percent: 42.7,
758 throughput_ops_per_sec: 750.0,
759 metadata,
760 };
761
762 println!("Debugging Comparison:");
763
764 // JSON debugging advantages
765 let json_data = debug_metrics.to_json()?;
766 println!("\nJSON Format - Debugging Advantages:");
767 println!(" ✓ Human readable without tools");
768 println!(" ✓ Can inspect values directly");
769 println!(" ✓ Text editors show structure");
770 println!(" ✓ Diff tools work naturally");
771 println!(" ✓ Version control friendly");
772
773 println!("\n Sample JSON output for debugging:");
774 for (i, line) in json_data.lines().enumerate() {
775 if i < 5 {
776 println!(" {}", line);
777 }
778 }
779
780 // Binary debugging limitations
781 let binary_data = debug_metrics.to_binary()?;
782 println!("\nBinary Format - Debugging Limitations:");
783 println!(" ✗ Requires special tools to inspect");
784 println!(" ✗ Not human readable");
785 println!(" ✗ Difficult to debug data corruption");
786 println!(" ✗ Version control shows as binary diff");
787
788 println!("\n Binary data (hex dump for debugging):");
789 print!(" ");
790 for (i, byte) in binary_data.iter().take(40).enumerate() {
791 if i > 0 && i % 16 == 0 {
792 println!();
793 print!(" ");
794 }
795 print!("{:02x} ", byte);
796 }
797 println!("\n (requires hex editor or custom tools)");
798
799 // Development workflow comparison
800 println!("\nDevelopment Workflow Impact:");
801
802 println!("\nJSON Workflow:");
803 println!(" 1. Save data to JSON file");
804 println!(" 2. Open in any text editor");
805 println!(" 3. Inspect values directly");
806 println!(" 4. Make manual edits if needed");
807 println!(" 5. Version control tracks changes");
808
809 println!("\nBinary Workflow:");
810 println!(" 1. Save data to binary file");
811 println!(" 2. Write debugging code to load and print");
812 println!(" 3. Use hex editor for low-level inspection");
813 println!(" 4. Cannot make manual edits easily");
814 println!(" 5. Version control shows binary changes only");
815
816 // Hybrid approach recommendation
817 println!("\nHybrid Approach for Development:");
818 println!(" - Use JSON during development/debugging");
819 println!(" - Switch to binary for production deployment");
820 println!(" - Provide debugging tools that export binary to JSON");
821 println!(" - Include format conversion utilities");
822
823 // Demonstrate debugging scenario
824 println!("\nDebugging Scenario Example:");
825 println!(" Problem: Performance metrics show unexpected values");
826
827 // Save both formats for comparison
828 debug_metrics.save_json("temp_debug_metrics.json")?;
829 debug_metrics.save_binary("temp_debug_metrics.bin")?;
830
831 println!(" JSON approach: Open temp_debug_metrics.json in editor");
832 println!(" -> Immediately see cpu_usage_percent: 42.7");
833 println!(" -> Compare with expected range");
834 println!(" -> Check metadata for debug_session: 'session_123'");
835
836 println!(" Binary approach: Write debugging code");
837 println!(" -> Load binary file programmatically");
838 println!(" -> Print values to console");
839 println!(" -> Additional development time required");
840
841 Ok(())
842}
fn from_json(json: &str) -> SerializationResult<Self>
Creates the struct from a JSON string
Deserializes a JSON string into a new instance of the struct. The JSON should contain all required fields in the expected format.
§Arguments
json - JSON string containing the struct data
§Returns
Ok(Self) on successful deserialization
Err(SerializationError) if JSON parsing or deserialization fails
§Examples
Creates struct from JSON string with proper parsing and validation.
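As a rough illustration of the parse-and-validate flow this method performs, here is a standalone sketch of a `from_json`-style constructor for a minimal two-field struct. Everything in it (`Point`, `field_value`, `point_from_json`) is hypothetical scaffolding written against only the standard library; it is not the crate's implementation, which routes through `StructDeserializer` and returns `SerializationResult<Self>` rather than a plain `Result`.

```rust
/// Standalone sketch (NOT the crate's implementation): a naive
/// `from_json`-style parser for a two-integer-field struct.
#[derive(Debug, PartialEq)]
struct Point {
    x: i64,
    y: i64,
}

/// Locate `"key":` in the JSON text and parse the integer that follows.
/// Naive by design: no nesting, strings, or escape handling.
fn field_value(json: &str, key: &str) -> Option<i64> {
    let pat = format!("\"{}\":", key);
    let start = json.find(pat.as_str())? + pat.len();
    let rest = json[start..].trim_start();
    let end = rest
        .find(|c: char| !(c.is_ascii_digit() || c == '-'))
        .unwrap_or(rest.len());
    rest[..end].parse::<i64>().ok()
}

/// Parse, then validate: every required field must be present,
/// and a missing field surfaces as an error (mirroring how
/// `from_json` reports failures via its result type).
fn point_from_json(json: &str) -> Result<Point, String> {
    let x = field_value(json, "x").ok_or("missing field `x`")?;
    let y = field_value(json, "y").ok_or("missing field `y`")?;
    Ok(Point { x, y })
}

fn main() {
    let p = point_from_json(r#"{"x": 3, "y": -7}"#).unwrap();
    println!("parsed: {:?}", p);
}
```

With the real trait, the equivalent call is simply `let user = UserProfile::from_json(&json)?;`, with parse and validation errors folded into `SerializationError`.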
Examples found in repository?
91 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
92 // Try JSON object first
93 if let Ok(json_data) = value.as_json_object() {
94 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
95 field: field_name.to_string(),
96 message: format!("Failed to deserialize UserProfile from JSON: {}", e),
97 });
98 }
99
100 // Try binary object
101 if let Ok(binary_data) = value.as_binary_object() {
102 return Self::from_binary(binary_data).map_err(|e| {
103 SerializationError::ValidationFailed {
104 field: field_name.to_string(),
105 message: format!("Failed to deserialize UserProfile from binary: {}", e),
106 }
107 });
108 }
109
110 Err(SerializationError::ValidationFailed {
111 field: field_name.to_string(),
112 message: format!(
113 "Expected JsonObject or BinaryObject for UserProfile, found {}",
114 value.type_name()
115 ),
116 })
117 }
118}
119
120/// Application settings struct with optional fields and collections
121#[derive(Debug, Clone, PartialEq)]
122pub struct AppSettings {
123 pub app_name: String,
124 pub version: String,
125 pub debug_mode: bool,
126 pub max_connections: u32,
127 pub timeout_seconds: f32,
128 pub features: Vec<String>,
129 pub environment_vars: HashMap<String, String>,
130 pub optional_database_url: Option<String>,
131}
132
133impl StructSerializable for AppSettings {
134 fn to_serializer(&self) -> StructSerializer {
135 StructSerializer::new()
136 .field("app_name", &self.app_name)
137 .field("version", &self.version)
138 .field("debug_mode", &self.debug_mode)
139 .field("max_connections", &self.max_connections)
140 .field("timeout_seconds", &self.timeout_seconds)
141 .field("features", &self.features)
142 .field("environment_vars", &self.environment_vars)
143 .field("optional_database_url", &self.optional_database_url)
144 }
145
146 fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
147 let app_name = deserializer.field("app_name")?;
148 let version = deserializer.field("version")?;
149 let debug_mode = deserializer.field("debug_mode")?;
150 let max_connections = deserializer.field("max_connections")?;
151 let timeout_seconds = deserializer.field("timeout_seconds")?;
152 let features = deserializer.field("features")?;
153 let environment_vars = deserializer.field("environment_vars")?;
154 let optional_database_url = deserializer.field("optional_database_url")?;
155
156 Ok(AppSettings {
157 app_name,
158 version,
159 debug_mode,
160 max_connections,
161 timeout_seconds,
162 features,
163 environment_vars,
164 optional_database_url,
165 })
166 }
167}
168
169impl ToFieldValue for AppSettings {
170 fn to_field_value(&self) -> FieldValue {
171 // Convert to JSON and then parse as FieldValue for nested object handling
172 match self.to_json() {
173 Ok(json_str) => FieldValue::from_json_object(json_str),
174 Err(_) => FieldValue::from_string("serialization_error".to_string()),
175 }
176 }
177}
178
179impl FromFieldValue for AppSettings {
180 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
181 // Try JSON object first
182 if let Ok(json_data) = value.as_json_object() {
183 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
184 field: field_name.to_string(),
185 message: format!("Failed to deserialize AppSettings from JSON: {}", e),
186 });
187 }
188
189 // Try binary object
190 if let Ok(binary_data) = value.as_binary_object() {
191 return Self::from_binary(binary_data).map_err(|e| {
192 SerializationError::ValidationFailed {
193 field: field_name.to_string(),
194 message: format!("Failed to deserialize AppSettings from binary: {}", e),
195 }
196 });
197 }
198
199 Err(SerializationError::ValidationFailed {
200 field: field_name.to_string(),
201 message: format!(
202 "Expected JsonObject or BinaryObject for AppSettings, found {}",
203 value.type_name()
204 ),
205 })
206 }
207}
208
209fn main() -> Result<(), Box<dyn std::error::Error>> {
210 println!("=== Basic Struct Serialization Example ===\n");
211
212 demonstrate_user_profile_serialization()?;
213 demonstrate_app_settings_serialization()?;
214 demonstrate_format_comparison()?;
215 demonstrate_roundtrip_verification()?;
216 demonstrate_field_access_patterns()?;
217 cleanup_temp_files()?;
218
219 println!("\n=== Example completed successfully! ===");
220 Ok(())
221}
222
223/// Demonstrate basic struct serialization with simple field types
224fn demonstrate_user_profile_serialization() -> Result<(), Box<dyn std::error::Error>> {
225 println!("--- User Profile Serialization ---");
226
227 // Create a user profile with various field types
228 let user = UserProfile {
229 id: 12345,
230 username: "alice_cooper".to_string(),
231 email: "alice@example.com".to_string(),
232 age: 28,
233 is_active: true,
234 score: 95.7,
235 };
236
237 println!("Original user profile:");
238 println!(" ID: {}", user.id);
239 println!(" Username: {}", user.username);
240 println!(" Email: {}", user.email);
241 println!(" Age: {}", user.age);
242 println!(" Active: {}", user.is_active);
243 println!(" Score: {}", user.score);
244
245 // Serialize to JSON
246 let json_data = user.to_json()?;
247 println!("\nSerialized to JSON:");
248 println!("{}", json_data);
249
250 // Save to JSON file
251 user.save_json("temp_user_profile.json")?;
252 println!("Saved to file: temp_user_profile.json");
253
254 // Load from JSON file
255 let loaded_user = UserProfile::load_json("temp_user_profile.json")?;
256 println!("\nLoaded user profile:");
257 println!(" ID: {}", loaded_user.id);
258 println!(" Username: {}", loaded_user.username);
259 println!(" Email: {}", loaded_user.email);
260 println!(" Age: {}", loaded_user.age);
261 println!(" Active: {}", loaded_user.is_active);
262 println!(" Score: {}", loaded_user.score);
263
264 // Verify data integrity
265 assert_eq!(user, loaded_user);
266 println!("Data integrity verification: PASSED");
267
268 Ok(())
269}
270
271/// Demonstrate serialization with collections and optional fields
272fn demonstrate_app_settings_serialization() -> Result<(), Box<dyn std::error::Error>> {
273 println!("\n--- App Settings Serialization ---");
274
275 // Create app settings with collections and optional fields
276 let mut env_vars = HashMap::new();
277 env_vars.insert("LOG_LEVEL".to_string(), "info".to_string());
278 env_vars.insert("PORT".to_string(), "8080".to_string());
279 env_vars.insert("HOST".to_string(), "localhost".to_string());
280
281 let settings = AppSettings {
282 app_name: "Train Station Example".to_string(),
283 version: "1.0.0".to_string(),
284 debug_mode: true,
285 max_connections: 100,
286 timeout_seconds: 30.5,
287 features: vec![
288 "authentication".to_string(),
289 "logging".to_string(),
290 "metrics".to_string(),
291 ],
292 environment_vars: env_vars,
293 optional_database_url: Some("postgresql://localhost:5432/mydb".to_string()),
294 };
295
296 println!("Original app settings:");
297 println!(" App Name: {}", settings.app_name);
298 println!(" Version: {}", settings.version);
299 println!(" Debug Mode: {}", settings.debug_mode);
300 println!(" Max Connections: {}", settings.max_connections);
301 println!(" Timeout: {} seconds", settings.timeout_seconds);
302 println!(" Features: {:?}", settings.features);
303 println!(" Environment Variables: {:?}", settings.environment_vars);
304 println!(" Database URL: {:?}", settings.optional_database_url);
305
306 // Serialize to binary format for efficient storage
307 let binary_data = settings.to_binary()?;
308 println!("\nSerialized to binary: {} bytes", binary_data.len());
309
310 // Save to binary file
311 settings.save_binary("temp_app_settings.bin")?;
312 println!("Saved to file: temp_app_settings.bin");
313
314 // Load from binary file
315 let loaded_settings = AppSettings::load_binary("temp_app_settings.bin")?;
316 println!("\nLoaded app settings:");
317 println!(" App Name: {}", loaded_settings.app_name);
318 println!(" Version: {}", loaded_settings.version);
319 println!(" Debug Mode: {}", loaded_settings.debug_mode);
320 println!(" Features count: {}", loaded_settings.features.len());
321 println!(
322 " Environment variables count: {}",
323 loaded_settings.environment_vars.len()
324 );
325
326 // Verify data integrity
327 assert_eq!(settings, loaded_settings);
328 println!("Data integrity verification: PASSED");
329
330 Ok(())
331}
332
333/// Demonstrate format comparison between JSON and binary
334fn demonstrate_format_comparison() -> Result<(), Box<dyn std::error::Error>> {
335 println!("\n--- Format Comparison ---");
336
337 let user = UserProfile {
338 id: 98765,
339 username: "bob_builder".to_string(),
340 email: "bob@construction.com".to_string(),
341 age: 35,
342 is_active: false,
343 score: 87.2,
344 };
345
346 // Save in both formats
347 user.save_json("temp_format_comparison.json")?;
348 user.save_binary("temp_format_comparison.bin")?;
349
350 // Compare file sizes
351 let json_size = fs::metadata("temp_format_comparison.json")?.len();
352 let binary_size = fs::metadata("temp_format_comparison.bin")?.len();
353
354 println!("Format comparison for UserProfile:");
355 println!(" JSON file size: {} bytes", json_size);
356 println!(" Binary file size: {} bytes", binary_size);
357 println!(
358 " Size ratio (JSON/Binary): {:.2}x",
359 json_size as f64 / binary_size as f64
360 );
361
362 // Demonstrate readability
363 let json_content = fs::read_to_string("temp_format_comparison.json")?;
364 println!("\nJSON format (human-readable):");
365 println!("{}", json_content);
366
367 println!("\nBinary format (first 32 bytes as hex):");
368 let binary_content = fs::read("temp_format_comparison.bin")?;
369 for (i, byte) in binary_content.iter().take(32).enumerate() {
370 if i % 16 == 0 && i > 0 {
371 println!();
372 }
373 print!("{:02x} ", byte);
374 }
375 println!("\n... ({} total bytes)", binary_content.len());
376
377 // Load and verify both formats produce identical results
378 let json_loaded = UserProfile::load_json("temp_format_comparison.json")?;
379 let binary_loaded = UserProfile::load_binary("temp_format_comparison.bin")?;
380
381 assert_eq!(json_loaded, binary_loaded);
382 println!("\nFormat consistency verification: PASSED");
383
384 Ok(())
385}

/// Demonstrate roundtrip verification with multiple data variations
fn demonstrate_roundtrip_verification() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Roundtrip Verification ---");

    // Test various data patterns
    let test_users = [
        UserProfile {
            id: 0,
            username: "".to_string(),
            email: "empty@test.com".to_string(),
            age: 0,
            is_active: false,
            score: 0.0,
        },
        UserProfile {
            id: u32::MAX,
            username: "maximal_user_with_very_long_name_123456789".to_string(),
            email: "test@verylongdomainname.example.org".to_string(),
            age: i32::MAX,
            is_active: true,
            score: 999999.5,
        },
        UserProfile {
            id: 42,
            username: "unicode_tëst_🦀".to_string(),
            email: "unicode@tëst.com".to_string(),
            age: 25,
            is_active: true,
            score: -123.456,
        },
    ];

    println!(
        "Testing roundtrip serialization with {} variations:",
        test_users.len()
    );

    for (i, user) in test_users.iter().enumerate() {
        println!(
            " Test case {}: ID={}, Username='{}'",
            i + 1,
            user.id,
            user.username
        );

        // JSON roundtrip
        let json_data = user.to_json()?;
        let json_parsed = UserProfile::from_json(&json_data)?;
        assert_eq!(*user, json_parsed);

        // Binary roundtrip
        let binary_data = user.to_binary()?;
        let binary_parsed = UserProfile::from_binary(&binary_data)?;
        assert_eq!(*user, binary_parsed);

        println!(" JSON roundtrip: PASSED");
        println!(" Binary roundtrip: PASSED");
    }

    println!("All roundtrip tests: PASSED");

    Ok(())
}

/// Demonstrate field access patterns and validation
fn demonstrate_field_access_patterns() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Field Access Patterns ---");

    let settings = AppSettings {
        app_name: "Field Test App".to_string(),
        version: "2.1.0".to_string(),
        debug_mode: false,
        max_connections: 50,
        timeout_seconds: 15.0,
        features: vec!["basic".to_string(), "advanced".to_string()],
        environment_vars: HashMap::new(),
        optional_database_url: None,
    };

    // Convert to JSON to inspect structure
    let json_data = settings.to_json()?;
    println!("JSON structure for field inspection:");

    // Estimate the field count by counting key/value separators
    let field_count = json_data.matches(':').count();
    println!("Estimated fields: {}", field_count);

    // Show structure (first few lines)
    let lines: Vec<&str> = json_data.lines().take(5).collect();
    for line in lines {
        println!(" {}", line.trim());
    }
    if json_data.lines().count() > 5 {
        println!(" ... ({} more lines)", json_data.lines().count() - 5);
    }

    // Demonstrate optional field handling
    println!("\nOptional field handling:");
    println!(
        " Database URL is None: {}",
        settings.optional_database_url.is_none()
    );

    // Create a version with the optional field populated
    let settings_with_db = AppSettings {
        optional_database_url: Some("sqlite:///tmp/test.db".to_string()),
        ..settings.clone()
    };

    println!(
        " Database URL with value: {:?}",
        settings_with_db.optional_database_url
    );

    // Verify both versions serialize/deserialize correctly
    let json_none = settings.to_json()?;
    let json_some = settings_with_db.to_json()?;

    let parsed_none = AppSettings::from_json(&json_none)?;
    let parsed_some = AppSettings::from_json(&json_some)?;

    assert_eq!(settings, parsed_none);
    assert_eq!(settings_with_db, parsed_some);
    assert!(parsed_none.optional_database_url.is_none());
    assert!(parsed_some.optional_database_url.is_some());

    println!("Optional field serialization: PASSED");

    Ok(())
}

More examples
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize ContactInfo from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize ContactInfo from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for ContactInfo, found {}",
                value.type_name()
            ),
        })
    }
}

/// Address struct
#[derive(Debug, Clone, PartialEq)]
pub struct Address {
    pub street: String,
    pub city: String,
    pub state: String,
    pub postal_code: String,
    pub country: String,
}

impl StructSerializable for Address {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("street", &self.street)
            .field("city", &self.city)
            .field("state", &self.state)
            .field("postal_code", &self.postal_code)
            .field("country", &self.country)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        let street = deserializer.field("street")?;
        let city = deserializer.field("city")?;
        let state = deserializer.field("state")?;
        let postal_code = deserializer.field("postal_code")?;
        let country = deserializer.field("country")?;

        Ok(Address {
            street,
            city,
            state,
            postal_code,
            country,
        })
    }
}

impl ToFieldValue for Address {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for Address {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize Address from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize Address from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for Address, found {}",
                value.type_name()
            ),
        })
    }
}

/// Project information struct
#[derive(Debug, Clone, PartialEq)]
pub struct Project {
    pub name: String,
    pub description: String,
    pub status: ProjectStatus,
    pub budget: f64,
    pub team_members: Vec<String>,
    pub milestones: Vec<Milestone>,
    pub metadata: HashMap<String, String>,
}

impl StructSerializable for Project {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("name", &self.name)
            .field("description", &self.description)
            .field("status", &self.status)
            .field("budget", &self.budget)
            .field("team_members", &self.team_members)
            .field("milestones", &self.milestones)
            .field("metadata", &self.metadata)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        let name = deserializer.field("name")?;
        let description = deserializer.field("description")?;
        let status = deserializer.field("status")?;
        let budget = deserializer.field("budget")?;
        let team_members = deserializer.field("team_members")?;
        let milestones = deserializer.field("milestones")?;
        let metadata = deserializer.field("metadata")?;

        Ok(Project {
            name,
            description,
            status,
            budget,
            team_members,
            milestones,
            metadata,
        })
    }
}

impl ToFieldValue for Project {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for Project {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize Project from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize Project from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for Project, found {}",
                value.type_name()
            ),
        })
    }
}

/// Project status enumeration
#[derive(Debug, Clone, PartialEq)]
pub enum ProjectStatus {
    Planning,
    InProgress,
    OnHold,
    Completed,
    Cancelled,
}

impl ToFieldValue for ProjectStatus {
    fn to_field_value(&self) -> FieldValue {
        let status_str = match self {
            ProjectStatus::Planning => "planning",
            ProjectStatus::InProgress => "in_progress",
            ProjectStatus::OnHold => "on_hold",
            ProjectStatus::Completed => "completed",
            ProjectStatus::Cancelled => "cancelled",
        };
        FieldValue::from_string(status_str.to_string())
    }
}

impl FromFieldValue for ProjectStatus {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        match value {
            FieldValue::String(s) => match s.as_str() {
                "planning" => Ok(ProjectStatus::Planning),
                "in_progress" => Ok(ProjectStatus::InProgress),
                "on_hold" => Ok(ProjectStatus::OnHold),
                "completed" => Ok(ProjectStatus::Completed),
                "cancelled" => Ok(ProjectStatus::Cancelled),
                _ => Err(SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Unknown project status: {}", s),
                }),
            },
            _ => Err(SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!(
                    "Expected String for ProjectStatus, found {}",
                    value.type_name()
                ),
            }),
        }
    }
}

/// Project milestone struct
#[derive(Debug, Clone, PartialEq)]
pub struct Milestone {
    pub name: String,
    pub description: String,
    pub due_date: String, // Simplified as string for this example
    pub is_completed: bool,
    pub progress_percentage: f32,
    pub dependencies: Vec<String>,
}

impl StructSerializable for Milestone {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("name", &self.name)
            .field("description", &self.description)
            .field("due_date", &self.due_date)
            .field("is_completed", &self.is_completed)
            .field("progress_percentage", &self.progress_percentage)
            .field("dependencies", &self.dependencies)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        let name = deserializer.field("name")?;
        let description = deserializer.field("description")?;
        let due_date = deserializer.field("due_date")?;
        let is_completed = deserializer.field("is_completed")?;
        let progress_percentage = deserializer.field("progress_percentage")?;
        let dependencies = deserializer.field("dependencies")?;

        Ok(Milestone {
            name,
            description,
            due_date,
            is_completed,
            progress_percentage,
            dependencies,
        })
    }
}

impl ToFieldValue for Milestone {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for Milestone {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize Milestone from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize Milestone from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for Milestone, found {}",
                value.type_name()
            ),
        })
    }
}

/// Company struct with basic collections and nesting
#[derive(Debug, Clone, PartialEq)]
pub struct Company {
    pub name: String,
    pub founded_year: i32,
    pub headquarters_city: String,
    pub headquarters_state: String,
    pub employee_count: usize,
    pub department_names: Vec<String>,
    pub active_project_names: Vec<String>,
    pub company_metadata: HashMap<String, String>,
}

impl StructSerializable for Company {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("name", &self.name)
            .field("founded_year", &self.founded_year)
            .field("headquarters_city", &self.headquarters_city)
            .field("headquarters_state", &self.headquarters_state)
            .field("employee_count", &self.employee_count)
            .field("department_names", &self.department_names)
            .field("active_project_names", &self.active_project_names)
            .field("company_metadata", &self.company_metadata)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        let name = deserializer.field("name")?;
        let founded_year = deserializer.field("founded_year")?;
        let headquarters_city = deserializer.field("headquarters_city")?;
        let headquarters_state = deserializer.field("headquarters_state")?;
        let employee_count = deserializer.field("employee_count")?;
        let department_names = deserializer.field("department_names")?;
        let active_project_names = deserializer.field("active_project_names")?;
        let company_metadata = deserializer.field("company_metadata")?;

        Ok(Company {
            name,
            founded_year,
            headquarters_city,
            headquarters_state,
            employee_count,
            department_names,
            active_project_names,
            company_metadata,
        })
    }
}

impl ToFieldValue for Company {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for Company {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize Company from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize Company from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for Company, found {}",
                value.type_name()
            ),
        })
    }
}

/// Department struct
#[derive(Debug, Clone, PartialEq)]
pub struct Department {
    pub name: String,
    pub manager: String,
    pub employee_count: u32,
    pub budget: f64,
    pub office_locations: Vec<Address>,
}

impl StructSerializable for Department {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("name", &self.name)
            .field("manager", &self.manager)
            .field("employee_count", &self.employee_count)
            .field("budget", &self.budget)
            .field("office_locations", &self.office_locations)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        let name = deserializer.field("name")?;
        let manager = deserializer.field("manager")?;
        let employee_count = deserializer.field("employee_count")?;
        let budget = deserializer.field("budget")?;
        let office_locations = deserializer.field("office_locations")?;

        Ok(Department {
            name,
            manager,
            employee_count,
            budget,
            office_locations,
        })
    }
}

impl ToFieldValue for Department {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for Department {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize Department from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize Department from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for Department, found {}",
                value.type_name()
            ),
        })
    }
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    println!("=== Nested Structures Serialization Example ===\n");

    demonstrate_nested_struct_creation()?;
    demonstrate_deep_serialization()?;
    demonstrate_collection_nesting()?;
    demonstrate_partial_loading()?;
    demonstrate_performance_analysis()?;
    cleanup_temp_files()?;

    println!("\n=== Example completed successfully! ===");
    Ok(())
}

/// Demonstrate creating complex nested structures
fn demonstrate_nested_struct_creation() -> Result<(), Box<dyn std::error::Error>> {
    println!("--- Nested Structure Creation ---");

    // Create nested address and contact info
    let headquarters = Address {
        street: "123 Innovation Drive".to_string(),
        city: "Tech City".to_string(),
        state: "CA".to_string(),
        postal_code: "94000".to_string(),
        country: "USA".to_string(),
    };

    let mut social_media = HashMap::new();
    social_media.insert("twitter".to_string(), "@techcorp".to_string());
    social_media.insert("linkedin".to_string(), "techcorp-inc".to_string());

    let contact_info = ContactInfo {
        email: "info@techcorp.com".to_string(),
        phone: Some("+1-555-0123".to_string()),
        address_city: headquarters.city.clone(),
        address_state: headquarters.state.clone(),
        social_media,
    };

    // Create departments with nested office locations
    let engineering_office = Address {
        street: "456 Developer Lane".to_string(),
        city: "Code City".to_string(),
        state: "CA".to_string(),
        postal_code: "94001".to_string(),
        country: "USA".to_string(),
    };

    let departments = [
        Department {
            name: "Engineering".to_string(),
            manager: "Alice Johnson".to_string(),
            employee_count: 50,
            budget: 2500000.0,
            office_locations: vec![engineering_office, headquarters.clone()],
        },
        Department {
            name: "Marketing".to_string(),
            manager: "Bob Smith".to_string(),
            employee_count: 15,
            budget: 800000.0,
            office_locations: vec![headquarters.clone()],
        },
    ];

    // Create projects with milestones
    let milestones = vec![
        Milestone {
            name: "Requirements Analysis".to_string(),
            description: "Complete system requirements documentation".to_string(),
            due_date: "2024-03-15".to_string(),
            is_completed: true,
            progress_percentage: 100.0,
            dependencies: vec![],
        },
        Milestone {
            name: "Architecture Design".to_string(),
            description: "Define system architecture and components".to_string(),
            due_date: "2024-04-01".to_string(),
            is_completed: false,
            progress_percentage: 75.0,
            dependencies: vec!["Requirements Analysis".to_string()],
        },
    ];

    let mut project_metadata = HashMap::new();
    project_metadata.insert("priority".to_string(), "high".to_string());
    project_metadata.insert("client".to_string(), "internal".to_string());

    let projects = [Project {
        name: "Train Station ML Platform".to_string(),
        description: "Next-generation machine learning infrastructure".to_string(),
        status: ProjectStatus::InProgress,
        budget: 1500000.0,
        team_members: vec![
            "Alice Johnson".to_string(),
            "Charlie Brown".to_string(),
            "Diana Prince".to_string(),
        ],
        milestones: milestones.clone(),
        metadata: project_metadata,
    }];

    // Create the complete company structure
    let mut company_metadata = HashMap::new();
    company_metadata.insert("industry".to_string(), "technology".to_string());
    company_metadata.insert("stock_symbol".to_string(), "TECH".to_string());

    let company = Company {
        name: "TechCorp Inc.".to_string(),
        founded_year: 2015,
        headquarters_city: headquarters.city.clone(),
        headquarters_state: headquarters.state.clone(),
        employee_count: 250,
        department_names: departments.iter().map(|d| d.name.clone()).collect(),
        active_project_names: projects.iter().map(|p| p.name.clone()).collect(),
        company_metadata,
    };

    println!("Created complex company structure:");
    println!(" Company: {}", company.name);
    println!(" Founded: {}", company.founded_year);
    println!(
        " Headquarters: {}, {}",
        company.headquarters_city, company.headquarters_state
    );
    println!(" Employee Count: {}", company.employee_count);
    println!(" Departments: {}", company.department_names.len());
    println!(" Active Projects: {}", company.active_project_names.len());

    // Save the complete structure
    company.save_json("temp_nested_company.json")?;
    println!("Saved nested structure to: temp_nested_company.json");

    // Verify loading preserves all nested data
    let loaded_company = Company::load_json("temp_nested_company.json")?;
    assert_eq!(company, loaded_company);
    println!("Successfully verified Company roundtrip serialization");

    // Also demonstrate individual component serialization
    let address_json = headquarters.to_json()?;
    let loaded_address = Address::from_json(&address_json)?;
    assert_eq!(headquarters, loaded_address);
    println!("Successfully serialized/deserialized Address component");

    let contact_json = contact_info.to_json()?;
    let loaded_contact = ContactInfo::from_json(&contact_json)?;
    assert_eq!(contact_info, loaded_contact);
    println!("Successfully serialized/deserialized ContactInfo component");
    println!("Nested structure integrity: VERIFIED");

    Ok(())
}

/// Demonstrate deep serialization with complex nesting
fn demonstrate_deep_serialization() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Deep Serialization Analysis ---");

    let deep_milestone = Milestone {
        name: "Deep Milestone".to_string(),
        description: "Testing deep nesting serialization".to_string(),
        due_date: "2024-12-31".to_string(),
        is_completed: false,
        progress_percentage: 50.0,
        dependencies: vec!["Parent Task".to_string(), "Sibling Task".to_string()],
    };

    let deep_project = Project {
        name: "Deep Nesting Test".to_string(),
        description: "Project for testing serialization depth".to_string(),
        status: ProjectStatus::Planning,
        budget: 100000.0,
        team_members: vec!["Developer 1".to_string(), "Developer 2".to_string()],
        milestones: vec![deep_milestone],
        metadata: HashMap::new(),
    };

    // Analyze serialization output
    let json_output = deep_project.to_json()?;
    let binary_output = deep_project.to_binary()?;

    println!("Deep structure serialization analysis:");
    println!(" JSON size: {} bytes", json_output.len());
    println!(" Binary size: {} bytes", binary_output.len());
    println!(" Nesting levels: Project -> Milestone -> Dependencies");

    // Count nested objects in JSON (rough estimate)
    let object_count = json_output.matches('{').count();
    let array_count = json_output.matches('[').count();
    println!(" JSON objects: {}", object_count);
    println!(" JSON arrays: {}", array_count);

    // Verify deep roundtrip
    let json_parsed = Project::from_json(&json_output)?;
    let binary_parsed = Project::from_binary(&binary_output)?;

    assert_eq!(deep_project, json_parsed);
    assert_eq!(deep_project, binary_parsed);
    println!("Deep serialization roundtrip: VERIFIED");

    Ok(())
}

/// Demonstrate collection nesting patterns
fn demonstrate_collection_nesting() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Collection Nesting Patterns ---");

    // Create multiple departments with varying complexity
    let departments = vec![
        Department {
            name: "Research".to_string(),
            manager: "Dr. Science".to_string(),
            employee_count: 25,
            budget: 1200000.0,
            office_locations: vec![
                Address {
                    street: "1 Research Blvd".to_string(),
                    city: "Innovation Hub".to_string(),
                    state: "MA".to_string(),
                    postal_code: "02101".to_string(),
                    country: "USA".to_string(),
                },
                Address {
                    street: "2 Lab Street".to_string(),
                    city: "Tech Valley".to_string(),
                    state: "NY".to_string(),
                    postal_code: "12180".to_string(),
                    country: "USA".to_string(),
                },
            ],
        },
        Department {
            name: "Quality Assurance".to_string(),
            manager: "Test Master".to_string(),
            employee_count: 12,
            budget: 600000.0,
            office_locations: vec![], // Empty collection
        },
    ];

    println!("Collection nesting analysis:");
    println!(" Departments: {}", departments.len());

    let total_locations: usize = departments.iter().map(|d| d.office_locations.len()).sum();
    println!(" Total office locations: {}", total_locations);

    // Test serialization with mixed empty and populated collections
    // Note: Vec<Department> doesn't implement StructSerializable directly.
    // For this example, we'll serialize each department individually
    let department_json_strings: Result<Vec<String>, _> =
        departments.iter().map(|dept| dept.to_json()).collect();
    let department_json_strings = department_json_strings?;

    // Deserialize each department back
    let parsed_departments: Result<Vec<Department>, _> = department_json_strings
        .iter()
        .map(|json_str| Department::from_json(json_str))
        .collect();
    let parsed_departments = parsed_departments?;

    assert_eq!(departments, parsed_departments);
    println!("Collection nesting serialization: VERIFIED");

    // Analyze collection patterns
    for (i, dept) in departments.iter().enumerate() {
        println!(
            " Department {}: {} locations",
            i + 1,
            dept.office_locations.len()
        );
    }

    Ok(())
}

/// Demonstrate partial loading and field access
fn demonstrate_partial_loading() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Partial Loading and Field Access ---");

    // Create a simple project for analysis
    let project = Project {
        name: "Sample Project".to_string(),
        description: "For testing partial loading".to_string(),
        status: ProjectStatus::InProgress,
        budget: 50000.0,
        team_members: vec!["Alice".to_string(), "Bob".to_string()],
        milestones: vec![Milestone {
            name: "Phase 1".to_string(),
            description: "Initial phase".to_string(),
            due_date: "2024-06-01".to_string(),
            is_completed: true,
            progress_percentage: 100.0,
            dependencies: vec![],
        }],
        metadata: HashMap::new(),
    };

    // Convert to JSON and analyze structure
    println!("Project JSON structure analysis:");

    // Parse to examine available fields by inspecting JSON structure
    let json_data = project.to_json()?;
    let field_count = json_data.matches(':').count();
    println!(" Estimated fields: {}", field_count);

    // Show top-level structure
    let lines: Vec<&str> = json_data.lines().take(10).collect();
    println!(" JSON structure preview:");
    for line in lines.iter().take(5) {
        if let Some(colon_pos) = line.find(':') {
            let field_name = line[..colon_pos].trim().trim_matches('"').trim();
            if !field_name.is_empty() {
                println!(" - {}", field_name);
            }
        }
    }

    // Demonstrate field type analysis
    println!("\nField type analysis:");
    println!(" name: String");
    println!(" status: Enum -> String");
    println!(" budget: f64 -> Number");
    println!(" team_members: Vec<String> -> Array");
    println!(" milestones: Vec<Milestone> -> Array of Objects");

    Ok(())
}

/// Demonstrate performance analysis for nested structures
fn demonstrate_performance_analysis() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Performance Analysis ---");

    // Create structures of varying complexity
    let simple_address = Address {
        street: "123 Main St".to_string(),
        city: "Anytown".to_string(),
        state: "ST".to_string(),
        postal_code: "12345".to_string(),
        country: "USA".to_string(),
    };

    let complex_department = Department {
        name: "Complex Department".to_string(),
        manager: "Manager Name".to_string(),
        employee_count: 100,
        budget: 5000000.0,
        office_locations: vec![simple_address.clone(); 10], // 10 identical addresses
    };

    let complex_project = Project {
        name: "Complex Project".to_string(),
        description: "Large project with many components".to_string(),
        status: ProjectStatus::InProgress,
        budget: 2000000.0,
        team_members: (1..=50).map(|i| format!("Team Member {}", i)).collect(),
        milestones: (1..=20)
            .map(|i| Milestone {
                name: format!("Milestone {}", i),
                description: format!("Description for milestone {}", i),
                due_date: "2024-12-31".to_string(),
                is_completed: i <= 10,
                progress_percentage: if i <= 10 { 100.0 } else { 50.0 },
                dependencies: if i > 1 {
                    vec![format!("Milestone {}", i - 1)]
                } else {
                    vec![]
                },
            })
            .collect(),
        metadata: HashMap::new(),
    };

    // Measure serialization performance
    println!("Performance comparison:");

    // Simple address
    let addr_json = simple_address.to_json()?;
    let addr_binary = simple_address.to_binary()?;
    println!(" Simple Address:");
    println!(" JSON: {} bytes", addr_json.len());
    println!(" Binary: {} bytes", addr_binary.len());

    // Complex department
    let dept_json = complex_department.to_json()?;
    let dept_binary = complex_department.to_binary()?;
    println!(" Complex Department (10 addresses):");
    println!(" JSON: {} bytes", dept_json.len());
    println!(" Binary: {} bytes", dept_binary.len());

    // Complex project
    let proj_json = complex_project.to_json()?;
    let proj_binary = complex_project.to_binary()?;
    println!(" Complex Project (50 members, 20 milestones):");
    println!(" JSON: {} bytes", proj_json.len());
    println!(" Binary: {} bytes", proj_binary.len());

    // Calculate efficiency ratios
    let dept_ratio = dept_json.len() as f64 / dept_binary.len() as f64;
    let proj_ratio = proj_json.len() as f64 / proj_binary.len() as f64;

    println!("\nFormat efficiency (JSON/Binary ratio):");
    println!(" Department: {:.2}x", dept_ratio);
    println!(" Project: {:.2}x", proj_ratio);

    // Verify complex structure roundtrip
    let proj_json_parsed = Project::from_json(&proj_json)?;
    let proj_binary_parsed = Project::from_binary(&proj_binary)?;

    assert_eq!(complex_project, proj_json_parsed);
    assert_eq!(complex_project, proj_binary_parsed);
    println!("Complex structure roundtrip: VERIFIED");

    Ok(())
}

    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
95 // Try JSON object first
96 if let Ok(json_data) = value.as_json_object() {
97 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
98 field: field_name.to_string(),
99 message: format!("Failed to deserialize VersionedData from JSON: {}", e),
100 });
101 }
102
103 // Try binary object
104 if let Ok(binary_data) = value.as_binary_object() {
105 return Self::from_binary(binary_data).map_err(|e| {
106 SerializationError::ValidationFailed {
107 field: field_name.to_string(),
108 message: format!("Failed to deserialize VersionedData from binary: {}", e),
109 }
110 });
111 }
112
113 Err(SerializationError::ValidationFailed {
114 field: field_name.to_string(),
115 message: format!(
116 "Expected JsonObject or BinaryObject for VersionedData, found {}",
117 value.type_name()
118 ),
119 })
120 }
121}
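The `from_field_value` implementations throughout these examples share one dispatch shape: try the JSON representation, then the binary one, and only then report the unexpected variant. A minimal self-contained sketch of that shape, using a stand-in `FieldValue` enum and `String` errors rather than the crate's real types (all names here are illustrative):

```rust
// Stand-in for the library's FieldValue; variants are illustrative only.
#[derive(Debug)]
enum FieldValue {
    JsonObject(String),
    BinaryObject(Vec<u8>),
    Text(String),
}

fn type_name(v: &FieldValue) -> &'static str {
    match v {
        FieldValue::JsonObject(_) => "JsonObject",
        FieldValue::BinaryObject(_) => "BinaryObject",
        FieldValue::Text(_) => "Text",
    }
}

// Try JSON first, then binary, else report what was actually found.
fn decode(value: &FieldValue) -> Result<String, String> {
    match value {
        FieldValue::JsonObject(json) => Ok(format!("from JSON: {}", json)),
        FieldValue::BinaryObject(bytes) => Ok(format!("from binary: {} bytes", bytes.len())),
        other => Err(format!(
            "Expected JsonObject or BinaryObject, found {}",
            type_name(other)
        )),
    }
}

fn main() {
    assert!(decode(&FieldValue::JsonObject("{}".into())).is_ok());
    assert!(decode(&FieldValue::BinaryObject(vec![1, 2, 3])).is_ok());
    assert!(decode(&FieldValue::Text("oops".into())).is_err());
}
```

Naming the found variant in the error message, as the real implementations do, makes a type mismatch diagnosable without a debugger.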

/// Validated user input with constraints
#[derive(Debug, Clone, PartialEq)]
pub struct ValidatedUserInput {
    pub username: String,
    pub email: String,
    pub age: u16,
    pub preferences: HashMap<String, String>,
}

impl StructSerializable for ValidatedUserInput {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("username", &self.username)
            .field("email", &self.email)
            .field("age", &self.age)
            .field("preferences", &self.preferences)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        let username: String = deserializer.field("username")?;
        let email: String = deserializer.field("email")?;
        let age: u16 = deserializer.field("age")?;
        let preferences: HashMap<String, String> = deserializer.field("preferences")?;

        // Validate username
        if username.is_empty() || username.len() > 50 {
            return Err(SerializationError::ValidationFailed {
                field: "username".to_string(),
                message: "Username must be 1-50 characters long".to_string(),
            });
        }

        if !username
            .chars()
            .all(|c| c.is_alphanumeric() || c == '_' || c == '-')
        {
            return Err(SerializationError::ValidationFailed {
                field: "username".to_string(),
                message:
                    "Username can only contain alphanumeric characters, underscores, and hyphens"
                        .to_string(),
            });
        }

        // Validate email (basic check)
        if !email.contains('@') || !email.contains('.') || email.len() < 5 {
            return Err(SerializationError::ValidationFailed {
                field: "email".to_string(),
                message: "Invalid email format".to_string(),
            });
        }

        // Validate age
        if !(13..=120).contains(&age) {
            return Err(SerializationError::ValidationFailed {
                field: "age".to_string(),
                message: "Age must be between 13 and 120".to_string(),
            });
        }

        // Validate preferences
        if preferences.len() > 20 {
            return Err(SerializationError::ValidationFailed {
                field: "preferences".to_string(),
                message: "Too many preferences (maximum 20)".to_string(),
            });
        }

        for (key, value) in &preferences {
            if key.len() > 50 || value.len() > 200 {
                return Err(SerializationError::ValidationFailed {
                    field: "preferences".to_string(),
                    message: format!("Preference key/value too long: {}", key),
                });
            }
        }

        Ok(ValidatedUserInput {
            username,
            email,
            age,
            preferences,
        })
    }
}
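The checks above run only on the deserialization path, so a `ValidatedUserInput` built directly with a struct literal bypasses them. One way to guarantee the invariants everywhere is to funnel both paths through a fallible constructor and have `from_deserializer` call it. A self-contained sketch of that pattern (the struct, constructor name, and `String` error type here are illustrative, not the crate's API):

```rust
// Illustrative stand-in: all construction goes through `new`, so the same
// validation runs for direct use and for deserialized data alike.
struct UserInput {
    username: String,
    age: u16,
}

impl UserInput {
    fn new(username: String, age: u16) -> Result<Self, String> {
        if username.is_empty() || username.len() > 50 {
            return Err("Username must be 1-50 characters long".into());
        }
        if !(13..=120).contains(&age) {
            return Err("Age must be between 13 and 120".into());
        }
        Ok(UserInput { username, age })
    }
}

fn main() {
    assert!(UserInput::new("john_doe".into(), 25).is_ok());
    assert!(UserInput::new("".into(), 25).is_err()); // empty username rejected
    assert!(UserInput::new("john_doe".into(), 10).is_err()); // age below 13 rejected
}
```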

impl ToFieldValue for ValidatedUserInput {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for ValidatedUserInput {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize ValidatedUserInput from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!(
                        "Failed to deserialize ValidatedUserInput from binary: {}",
                        e
                    ),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for ValidatedUserInput, found {}",
                value.type_name()
            ),
        })
    }
}

/// Recovery helper for handling partial data
#[derive(Debug, Clone, PartialEq)]
pub struct RecoverableData {
    pub critical_field: String,
    pub important_field: Option<String>,
    pub optional_field: Option<String>,
    pub metadata: HashMap<String, String>,
}

impl StructSerializable for RecoverableData {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("critical_field", &self.critical_field)
            .field("important_field", &self.important_field)
            .field("optional_field", &self.optional_field)
            .field("metadata", &self.metadata)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        // Critical field - must exist
        let critical_field = deserializer.field("critical_field")?;

        // Important field - try to recover if missing
        let important_field = deserializer.field_optional("important_field")?;

        // Optional field - graceful fallback
        let optional_field = deserializer.field_optional("optional_field")?;

        // Metadata - recover what we can
        let metadata = deserializer.field_or("metadata", HashMap::new())?;

        Ok(RecoverableData {
            critical_field,
            important_field,
            optional_field,
            metadata,
        })
    }
}

impl ToFieldValue for RecoverableData {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for RecoverableData {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize RecoverableData from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize RecoverableData from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for RecoverableData, found {}",
                value.type_name()
            ),
        })
    }
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    println!("=== Error Handling and Validation Example ===\n");

    demonstrate_common_error_scenarios()?;
    demonstrate_validation_patterns()?;
    demonstrate_schema_evolution()?;
    demonstrate_recovery_strategies()?;
    demonstrate_production_error_handling()?;
    cleanup_temp_files()?;

    println!("\n=== Example completed successfully! ===");
    Ok(())
}

/// Demonstrate common serialization error scenarios
fn demonstrate_common_error_scenarios() -> Result<(), Box<dyn std::error::Error>> {
    println!("--- Common Error Scenarios ---");

    fs::create_dir_all("temp_error_tests")?;

    // Scenario 1: Corrupted JSON file
    println!("1. Corrupted JSON File:");
    let corrupted_json = r#"{"name": "test", "value": 42, "incomplete"#;
    fs::write("temp_error_tests/corrupted.json", corrupted_json)?;

    match VersionedData::load_json("temp_error_tests/corrupted.json") {
        Ok(_) => println!(" Unexpected: Corrupted JSON was parsed successfully"),
        Err(e) => println!(" Expected error: {}", e),
    }

    // Scenario 2: Missing required fields
    println!("\n2. Missing Required Fields:");
    let incomplete_json = r#"{"name": "test"}"#;
    fs::write("temp_error_tests/incomplete.json", incomplete_json)?;

    match VersionedData::load_json("temp_error_tests/incomplete.json") {
        Ok(_) => println!(" Unexpected: Incomplete JSON was parsed successfully"),
        Err(e) => println!(" Expected error: {}", e),
    }

    // Scenario 3: Type mismatches
    println!("\n3. Type Mismatch:");
    let type_mismatch_json = r#"{"version": "not_a_number", "name": "test", "value": 42.0}"#;
    fs::write("temp_error_tests/type_mismatch.json", type_mismatch_json)?;

    match VersionedData::load_json("temp_error_tests/type_mismatch.json") {
        Ok(_) => println!(" Unexpected: Type mismatch was handled gracefully"),
        Err(e) => println!(" Expected error: {}", e),
    }

    // Scenario 4: File not found
    println!("\n4. File Not Found:");
    match VersionedData::load_json("temp_error_tests/nonexistent.json") {
        Ok(_) => println!(" Unexpected: Non-existent file was loaded"),
        Err(e) => println!(" Expected error: {}", e),
    }

    // Scenario 5: Binary format mismatch
    println!("\n5. Binary Format Mismatch:");
    let invalid_binary = vec![0xFF, 0xFF, 0xFF, 0xFF]; // Invalid binary data
    fs::write("temp_error_tests/invalid.bin", invalid_binary)?;

    match VersionedData::load_binary("temp_error_tests/invalid.bin") {
        Ok(_) => println!(" Unexpected: Invalid binary was parsed successfully"),
        Err(e) => println!(" Expected error: {}", e),
    }

    // Scenario 6: Wrong format loading
    println!("\n6. Wrong Format Loading:");
    let valid_data = VersionedData {
        version: 1,
        name: "test".to_string(),
        value: 42.0,
        optional_field: None,
        new_field: None,
    };
    valid_data.save_binary("temp_error_tests/valid.bin")?;

    // Try to load binary file as JSON
    match VersionedData::load_json("temp_error_tests/valid.bin") {
        Ok(_) => println!(" Unexpected: Binary file was loaded as JSON"),
        Err(e) => println!(" Expected error: {}", e),
    }

    Ok(())
}

/// Demonstrate validation patterns
fn demonstrate_validation_patterns() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Validation Patterns ---");

    println!("Testing input validation with various scenarios:");

    // Valid input
    println!("\n1. Valid Input:");
    let mut valid_preferences = HashMap::new();
    valid_preferences.insert("theme".to_string(), "dark".to_string());
    valid_preferences.insert("language".to_string(), "en".to_string());

    let valid_input = ValidatedUserInput {
        username: "john_doe".to_string(),
        email: "john@example.com".to_string(),
        age: 25,
        preferences: valid_preferences,
    };

    match valid_input.to_json() {
        Ok(json) => {
            println!(" ✓ Valid input serialized successfully");
            match ValidatedUserInput::from_json(&json) {
                Ok(_) => println!(" ✓ Valid input deserialized successfully"),
                Err(e) => println!(" ✗ Deserialization failed: {}", e),
            }
        }
        Err(e) => println!(" ✗ Serialization failed: {}", e),
    }

    // Test validation errors
    let validation_tests = vec![
        (
            "Empty username",
            ValidatedUserInput {
                username: "".to_string(),
                email: "test@example.com".to_string(),
                age: 25,
                preferences: HashMap::new(),
            },
        ),
        (
            "Invalid username characters",
            ValidatedUserInput {
                username: "user@name!".to_string(),
                email: "test@example.com".to_string(),
                age: 25,
                preferences: HashMap::new(),
            },
        ),
        (
            "Invalid email",
            ValidatedUserInput {
                username: "username".to_string(),
                email: "invalid_email".to_string(),
                age: 25,
                preferences: HashMap::new(),
            },
        ),
        (
            "Age too low",
            ValidatedUserInput {
                username: "username".to_string(),
                email: "test@example.com".to_string(),
                age: 10,
                preferences: HashMap::new(),
            },
        ),
        (
            "Age too high",
            ValidatedUserInput {
                username: "username".to_string(),
                email: "test@example.com".to_string(),
                age: 150,
                preferences: HashMap::new(),
            },
        ),
    ];

    for (description, invalid_input) in validation_tests {
        println!("\n2. {}:", description);
        match invalid_input.to_json() {
            Ok(json) => match ValidatedUserInput::from_json(&json) {
                Ok(_) => println!(" ✗ Unexpected: Invalid input was accepted"),
                Err(e) => println!(" ✓ Expected validation error: {}", e),
            },
            Err(e) => println!(" ✗ Serialization error: {}", e),
        }
    }

    // Test preferences validation
    println!("\n3. Preferences Validation:");
    let mut too_many_preferences = HashMap::new();
    for i in 0..25 {
        too_many_preferences.insert(format!("pref_{}", i), "value".to_string());
    }

    let invalid_prefs_input = ValidatedUserInput {
        username: "username".to_string(),
        email: "test@example.com".to_string(),
        age: 25,
        preferences: too_many_preferences,
    };

    match invalid_prefs_input.to_json() {
        Ok(json) => match ValidatedUserInput::from_json(&json) {
            Ok(_) => println!(" ✗ Unexpected: Too many preferences were accepted"),
            Err(e) => println!(" ✓ Expected validation error: {}", e),
        },
        Err(e) => println!(" ✗ Serialization error: {}", e),
    }

    Ok(())
}

/// Demonstrate schema evolution patterns
fn demonstrate_schema_evolution() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Schema Evolution Patterns ---");

    fs::create_dir_all("temp_schema_tests")?;

    // Create data with different schema versions
    println!("Creating data with different schema versions:");

    // Version 1 data (minimal)
    let v1_json = r#"{
        "version": 1,
        "name": "legacy_data",
        "value": 123.45
    }"#;
    fs::write("temp_schema_tests/v1_data.json", v1_json)?;
    println!(" ✓ Version 1 data created (minimal fields)");

    // Version 2 data (with optional field)
    let v2_json = r#"{
        "version": 2,
        "name": "v2_data",
        "value": 678.90,
        "optional_field": "added_in_v2"
    }"#;
    fs::write("temp_schema_tests/v2_data.json", v2_json)?;
    println!(" ✓ Version 2 data created (with optional field)");

    // Version 3 data (with all fields)
    let v3_data = VersionedData {
        version: 3,
        name: "v3_data".to_string(),
        value: 999.99,
        optional_field: Some("present".to_string()),
        new_field: Some(42),
    };
    v3_data.save_json("temp_schema_tests/v3_data.json")?;
    println!(" ✓ Version 3 data created (all fields)");

    // Test backward compatibility
    println!("\nTesting backward compatibility:");

    // Load v1 data with current deserializer
    match VersionedData::load_json("temp_schema_tests/v1_data.json") {
        Ok(data) => {
            println!(" ✓ V1 data loaded successfully:");
            println!(" Name: {}", data.name);
            println!(" Value: {}", data.value);
            println!(" Optional field: {:?}", data.optional_field);
            println!(" New field: {:?}", data.new_field);
        }
        Err(e) => println!(" ✗ Failed to load V1 data: {}", e),
    }

    // Load v2 data with current deserializer
    match VersionedData::load_json("temp_schema_tests/v2_data.json") {
        Ok(data) => {
            println!(" ✓ V2 data loaded successfully:");
            println!(" Name: {}", data.name);
            println!(" Value: {}", data.value);
            println!(" Optional field: {:?}", data.optional_field);
            println!(" New field: {:?}", data.new_field);
        }
        Err(e) => println!(" ✗ Failed to load V2 data: {}", e),
    }

    // Test future version rejection
    println!("\nTesting future version handling:");
    let future_version_json = r#"{
        "version": 99,
        "name": "future_data",
        "value": 123.45,
        "unknown_field": "should_be_ignored"
    }"#;
    fs::write("temp_schema_tests/future_data.json", future_version_json)?;

    match VersionedData::load_json("temp_schema_tests/future_data.json") {
        Ok(_) => println!(" ✗ Unexpected: Future version was accepted"),
        Err(e) => println!(" ✓ Expected rejection of future version: {}", e),
    }

    // Demonstrate migration strategy
    println!("\nDemonstrating migration strategy:");
    println!(" Strategy: Load old format, upgrade to new format, save");

    // Simulate migrating v1 data to v3 format
    let v1_loaded = VersionedData::load_json("temp_schema_tests/v1_data.json")?;
    let v1_upgraded = VersionedData {
        version: 3,
        name: v1_loaded.name,
        value: v1_loaded.value,
        optional_field: Some("migrated_default".to_string()),
        new_field: Some(0),
    };

    v1_upgraded.save_json("temp_schema_tests/v1_migrated.json")?;
    println!(" ✓ V1 data migrated to V3 format");

    Ok(())
}
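The migration step above can be factored into a single function that dispatches on the stored version and fills fields added in later schemas with defaults, so every load path funnels through one upgrade point. A self-contained sketch, assuming a simplified stand-in struct rather than the example's `VersionedData` (names and defaults are illustrative):

```rust
// Illustrative target schema: version 3 added optional_field and new_field.
#[derive(Debug, PartialEq)]
struct DataV3 {
    version: u32,
    name: String,
    value: f64,
    optional_field: Option<String>,
    new_field: Option<i64>,
}

// Upgrade any known older version to the current schema; reject unknown
// (future) versions explicitly rather than guessing at their layout.
fn migrate(version: u32, name: String, value: f64) -> Result<DataV3, String> {
    match version {
        1 => Ok(DataV3 {
            version: 3,
            name,
            value,
            optional_field: None, // field did not exist in v1
            new_field: None,      // field did not exist in v1
        }),
        2 | 3 => Ok(DataV3 {
            version: 3,
            name,
            value,
            optional_field: None,
            new_field: Some(0), // default for the v3-only field
        }),
        v => Err(format!("Unsupported schema version: {}", v)),
    }
}

fn main() {
    assert!(migrate(1, "legacy".into(), 1.0).is_ok());
    assert!(migrate(2, "v2".into(), 2.0).is_ok());
    assert!(migrate(99, "future".into(), 1.0).is_err());
}
```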

/// Demonstrate recovery strategies
fn demonstrate_recovery_strategies() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Recovery Strategies ---");

    fs::create_dir_all("temp_recovery_tests")?;

    // Strategy 1: Graceful degradation
    println!("1. Graceful Degradation Strategy:");

    // Create complete data
    let complete_data = RecoverableData {
        critical_field: "essential_info".to_string(),
        important_field: Some("important_info".to_string()),
        optional_field: Some("nice_to_have".to_string()),
        metadata: {
            let mut map = HashMap::new();
            map.insert("key1".to_string(), "value1".to_string());
            map.insert("key2".to_string(), "value2".to_string());
            map
        },
    };

    // Save complete data
    complete_data.save_json("temp_recovery_tests/complete.json")?;

    // Create partial data (missing some fields)
    let partial_json = r#"{
        "critical_field": "essential_info",
        "optional_field": "nice_to_have"
    }"#;
    fs::write("temp_recovery_tests/partial.json", partial_json)?;

    // Load partial data and demonstrate recovery
    match RecoverableData::load_json("temp_recovery_tests/partial.json") {
        Ok(recovered) => {
            println!(" ✓ Partial data recovered successfully:");
            println!(" Critical field: {}", recovered.critical_field);
            println!(
                " Important field: {:?} (missing, set to None)",
                recovered.important_field
            );
            println!(" Optional field: {:?}", recovered.optional_field);
            println!(
                " Metadata: {} entries (defaulted to empty)",
                recovered.metadata.len()
            );
        }
        Err(e) => println!(" ✗ Recovery failed: {}", e),
    }

    // Strategy 2: Error context preservation
    println!("\n2. Error Context Preservation:");

    let malformed_json = r#"{
        "critical_field": "essential_info",
        "important_field": 12345,
        "metadata": "not_a_map"
    }"#;
    fs::write("temp_recovery_tests/malformed.json", malformed_json)?;

    match RecoverableData::load_json("temp_recovery_tests/malformed.json") {
        Ok(_) => println!(" ✗ Unexpected: Malformed data was accepted"),
        Err(e) => {
            println!(" ✓ Error context preserved:");
            println!(" Error: {}", e);
            println!(" Error type: {:?}", std::mem::discriminant(&e));
        }
    }

    // Strategy 3: Fallback data sources
    println!("\n3. Fallback Data Sources:");

    // Primary source (corrupted)
    let corrupted_primary = "corrupted data";
    fs::write("temp_recovery_tests/primary.json", corrupted_primary)?;

    // Backup source (valid)
    let backup_data = RecoverableData {
        critical_field: "backup_critical".to_string(),
        important_field: Some("backup_important".to_string()),
        optional_field: None,
        metadata: HashMap::new(),
    };
    backup_data.save_json("temp_recovery_tests/backup.json")?;

    // Default fallback
    let default_data = RecoverableData {
        critical_field: "default_critical".to_string(),
        important_field: None,
        optional_field: None,
        metadata: HashMap::new(),
    };

    println!(" Attempting to load data with fallback chain:");

    // Try primary source
    let loaded_data = match RecoverableData::load_json("temp_recovery_tests/primary.json") {
        Ok(data) => {
            println!(" ✓ Loaded from primary source");
            data
        }
        Err(_) => {
            println!(" ✗ Primary source failed, trying backup");

            // Try backup source
            match RecoverableData::load_json("temp_recovery_tests/backup.json") {
                Ok(data) => {
                    println!(" ✓ Loaded from backup source");
                    data
                }
                Err(_) => {
                    println!(" ✗ Backup source failed, using default");
                    default_data
                }
            }
        }
    };

    println!(" Final loaded data:");
    println!(" Critical field: {}", loaded_data.critical_field);

    Ok(())
}
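The nested `match` fallback chain above grows awkwardly with each additional source. It can be generalized into a helper that walks a list of loaders and returns the first success, falling back to a default. A self-contained sketch (the helper name and `String` error type are illustrative, not part of the library):

```rust
// Walk the loaders in priority order; return the first Ok, else the default.
fn load_with_fallback<T>(
    sources: Vec<Box<dyn Fn() -> Result<T, String>>>,
    default: T,
) -> T {
    for source in sources {
        if let Ok(value) = source() {
            return value;
        }
    }
    default
}

fn main() {
    let loaded = load_with_fallback(
        vec![
            // Primary source: simulated corruption.
            Box::new(|| Err("primary corrupted".to_string())),
            // Backup source: succeeds.
            Box::new(|| Ok("backup data".to_string())),
        ],
        "default".to_string(),
    );
    assert_eq!(loaded, "backup data");
}
```

Each boxed closure would wrap a real call such as `RecoverableData::load_json(path)`, keeping the priority order in one flat list instead of nested matches.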

/// Demonstrate production-ready error handling
fn demonstrate_production_error_handling() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Production Error Handling ---");

    fs::create_dir_all("temp_production_tests")?;

    // Error logging and monitoring
    println!("1. Error Logging and Monitoring:");

    let test_data = VersionedData {
        version: 2,
        name: "production_test".to_string(),
        value: 42.0,
        optional_field: Some("test".to_string()),
        new_field: None,
    };

    // Create error log for demonstration
    let mut error_log = fs::OpenOptions::new()
        .create(true)
        .append(true)
        .open("temp_production_tests/error.log")?;

    // Function to log errors in production format
    let mut log_error =
        |error: &SerializationError, context: &str| -> Result<(), Box<dyn std::error::Error>> {
            let timestamp = std::time::SystemTime::now()
                .duration_since(std::time::UNIX_EPOCH)?
                .as_secs();

            writeln!(error_log, "[{}] ERROR in {}: {}", timestamp, context, error)?;
            Ok(())
        };

    // Simulate various error scenarios with logging
    let error_scenarios = vec![
        ("corrupted_file.json", "invalid json content"),
        ("missing_fields.json", r#"{"version": 1}"#),
        (
            "type_error.json",
            r#"{"version": "not_number", "name": "test", "value": 42.0}"#,
        ),
    ];

    for (filename, content) in error_scenarios {
        let filepath = format!("temp_production_tests/{}", filename);
        fs::write(&filepath, content)?;

        match VersionedData::load_json(&filepath) {
            Ok(_) => println!(" ✗ Unexpected success for {}", filename),
            Err(e) => {
                log_error(&e, &format!("load_config({})", filename))?;
                println!(" ✓ Error logged for {}: {}", filename, e);
            }
        }
    }

    // Health check pattern
    println!("\n2. Health Check Pattern:");

    let health_check = || -> Result<bool, SerializationError> {
        // Check if we can serialize/deserialize basic data
        let test_data = VersionedData {
            version: 1,
            name: "health_check".to_string(),
            value: 1.0,
            optional_field: None,
            new_field: None,
        };

        let serialized = test_data.to_json()?;
        let _deserialized = VersionedData::from_json(&serialized)?;
        Ok(true)
    };

    match health_check() {
        Ok(_) => println!(" ✓ Serialization system health check passed"),
        Err(e) => {
            log_error(&e, "health_check")?;
            println!(" ✗ Serialization system health check failed: {}", e);
        }
    }

    // Circuit breaker pattern simulation
    println!("\n3. Circuit Breaker Pattern:");

    struct CircuitBreaker {
        failure_count: u32,
        failure_threshold: u32,
        is_open: bool,
    }

    impl CircuitBreaker {
        fn new(threshold: u32) -> Self {
            Self {
                failure_count: 0,
                failure_threshold: threshold,
                is_open: false,
            }
        }

        fn call<F, T>(&mut self, operation: F) -> Result<T, String>
        where
            F: FnOnce() -> Result<T, SerializationError>,
        {
            if self.is_open {
                return Err("Circuit breaker is open".to_string());
            }

            match operation() {
                Ok(result) => {
                    self.failure_count = 0; // Reset on success
                    Ok(result)
                }
                Err(e) => {
                    self.failure_count += 1;
                    if self.failure_count >= self.failure_threshold {
                        self.is_open = true;
                        println!(
                            " Circuit breaker opened after {} failures",
                            self.failure_count
                        );
                    }
                    Err(e.to_string())
                }
            }
        }
    }

    let mut circuit_breaker = CircuitBreaker::new(3);

    // Simulate operations that fail
    for i in 1..=5 {
        let result = circuit_breaker
            .call(|| VersionedData::load_json("temp_production_tests/corrupted_file.json"));

        match result {
            Ok(_) => println!(" Operation {} succeeded", i),
            Err(e) => println!(" Operation {} failed: {}", i, e),
        }
    }

    // Retry mechanism
    println!("\n4. Retry Mechanism:");

    let retry_operation = |max_attempts: u32| -> Result<VersionedData, String> {
        for attempt in 1..=max_attempts {
            println!(" Attempt {}/{}", attempt, max_attempts);

            // Try different sources in order
            let sources = vec![
                "temp_production_tests/corrupted_file.json",
                "temp_production_tests/missing_fields.json",
                "temp_production_tests/backup_valid.json",
            ];

            if attempt == max_attempts {
                // On final attempt, create valid backup
                test_data
                    .save_json("temp_production_tests/backup_valid.json")
                    .map_err(|e| format!("Failed to create backup: {}", e))?;
            }

            for source in &sources {
                match VersionedData::load_json(source) {
                    Ok(data) => {
                        println!(" ✓ Succeeded loading from {}", source);
                        return Ok(data);
                    }
                    Err(_) => {
                        println!(" ✗ Failed to load from {}", source);
                        continue;
                    }
                }
            }

            if attempt < max_attempts {
                println!(" Waiting before retry...");
                // In real code, would sleep here
            }
        }

        Err("All retry attempts exhausted".to_string())
    };

    match retry_operation(3) {
        Ok(data) => println!(" ✓ Retry succeeded: {}", data.name),
        Err(e) => println!(" ✗ Retry failed: {}", e),
    }

    Ok(())
}

    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize PerformanceMetrics from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!(
                        "Failed to deserialize PerformanceMetrics from binary: {}",
                        e
                    ),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for PerformanceMetrics, found {}",
                value.type_name()
            ),
        })
    }
}

/// Large dataset for performance testing
#[derive(Debug, Clone, PartialEq)]
pub struct LargeDataset {
    pub name: String,
    pub values: Vec<f32>, // Changed from f64 to f32 (supported)
    pub labels: Vec<String>,
    pub feature_count: usize, // Simplified from Vec<Vec<f32>> to just a count
    pub feature_dimension: usize, // Store dimensions separately
    pub metadata: HashMap<String, String>,
    pub timestamp_count: usize, // Simplified from Vec<u64> to just a count
}

impl StructSerializable for LargeDataset {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("name", &self.name)
            .field("values", &self.values)
            .field("labels", &self.labels)
            .field("feature_count", &self.feature_count)
            .field("feature_dimension", &self.feature_dimension)
            .field("metadata", &self.metadata)
            .field("timestamp_count", &self.timestamp_count)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        let name = deserializer.field("name")?;
        let values = deserializer.field("values")?;
        let labels = deserializer.field("labels")?;
        let feature_count = deserializer.field("feature_count")?;
        let feature_dimension = deserializer.field("feature_dimension")?;
        let metadata = deserializer.field("metadata")?;
        let timestamp_count = deserializer.field("timestamp_count")?;

        Ok(LargeDataset {
            name,
            values,
            labels,
            feature_count,
            feature_dimension,
            metadata,
            timestamp_count,
        })
    }
}

impl ToFieldValue for LargeDataset {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for LargeDataset {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize LargeDataset from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize LargeDataset from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for LargeDataset, found {}",
                value.type_name()
            ),
        })
    }
}

/// Configuration data (typical JSON use case)
#[derive(Debug, Clone, PartialEq)]
pub struct Configuration {
    pub version: String,
    pub debug_enabled: bool,
    pub log_level: String,
    pub database_settings: HashMap<String, String>,
    pub feature_flags_enabled: bool, // Simplified from HashMap<String, bool>
    pub max_connections: f32, // Simplified from HashMap<String, f64>
    pub timeout_seconds: f32,
}

impl StructSerializable for Configuration {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("version", &self.version)
            .field("debug_enabled", &self.debug_enabled)
            .field("log_level", &self.log_level)
            .field("database_settings", &self.database_settings)
            .field("feature_flags_enabled", &self.feature_flags_enabled)
            .field("max_connections", &self.max_connections)
            .field("timeout_seconds", &self.timeout_seconds)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        let version = deserializer.field("version")?;
        let debug_enabled = deserializer.field("debug_enabled")?;
        let log_level = deserializer.field("log_level")?;
        let database_settings = deserializer.field("database_settings")?;
        let feature_flags_enabled = deserializer.field("feature_flags_enabled")?;
        let max_connections = deserializer.field("max_connections")?;
        let timeout_seconds = deserializer.field("timeout_seconds")?;

        Ok(Configuration {
            version,
            debug_enabled,
            log_level,
            database_settings,
            feature_flags_enabled,
            max_connections,
            timeout_seconds,
        })
    }
}

impl ToFieldValue for Configuration {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for Configuration {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize Configuration from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize Configuration from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for Configuration, found {}",
                value.type_name()
            ),
        })
    }
}
286
287/// Format comparison results
288#[derive(Debug)]
289pub struct FormatComparison {
290 pub data_type: String,
291 pub json_size_bytes: u64,
292 pub binary_size_bytes: u64,
293 pub json_serialize_micros: u64,
294 pub binary_serialize_micros: u64,
295 pub json_deserialize_micros: u64,
296 pub binary_deserialize_micros: u64,
297 pub size_ratio: f64,
298 pub serialize_speed_ratio: f64,
299 pub deserialize_speed_ratio: f64,
300}
301
302impl FormatComparison {
303 fn new(data_type: String) -> Self {
304 Self {
305 data_type,
306 json_size_bytes: 0,
307 binary_size_bytes: 0,
308 json_serialize_micros: 0,
309 binary_serialize_micros: 0,
310 json_deserialize_micros: 0,
311 binary_deserialize_micros: 0,
312 size_ratio: 0.0,
313 serialize_speed_ratio: 0.0,
314 deserialize_speed_ratio: 0.0,
315 }
316 }
317
318 fn calculate_ratios(&mut self) {
319 self.size_ratio = self.json_size_bytes as f64 / self.binary_size_bytes.max(1) as f64; // .max(1) guards 0-byte/0-μs divisions
320 self.serialize_speed_ratio =
321 self.json_serialize_micros as f64 / self.binary_serialize_micros.max(1) as f64;
322 self.deserialize_speed_ratio =
323 self.json_deserialize_micros as f64 / self.binary_deserialize_micros.max(1) as f64;
324 }
325}
326
327fn main() -> Result<(), Box<dyn std::error::Error>> {
328 println!("=== JSON vs Binary Format Comparison Example ===\n");
329
330 demonstrate_format_characteristics()?;
331 demonstrate_size_comparisons()?;
332 demonstrate_performance_benchmarks()?;
333 demonstrate_use_case_recommendations()?;
334 demonstrate_debugging_capabilities()?;
335 cleanup_temp_files()?;
336
337 println!("\n=== Example completed successfully! ===");
338 Ok(())
339}
340
341/// Demonstrate basic format characteristics
342fn demonstrate_format_characteristics() -> Result<(), Box<dyn std::error::Error>> {
343 println!("--- Format Characteristics ---");
344
345 // Create sample data structures
346 let mut metadata = HashMap::new();
347 metadata.insert("operation_type".to_string(), "benchmark".to_string());
348 metadata.insert("system".to_string(), "train_station".to_string());
349
350 let metrics = PerformanceMetrics {
351 operation: "tensor_multiplication".to_string(),
352 duration_micros: 1234,
353 memory_usage_bytes: 8192,
354 cpu_usage_percent: 75.5,
355 throughput_ops_per_sec: 1000.0,
356 metadata,
357 };
358
359 println!("Format characteristics analysis:");
360
361 // JSON characteristics
362 let json_data = metrics.to_json()?;
363 let json_lines = json_data.lines().count();
364 let json_chars = json_data.chars().count();
365
366 println!("\nJSON Format:");
367 println!(" Size: {} bytes", json_data.len());
368 println!(" Characters: {}", json_chars);
369 println!(" Lines: {}", json_lines);
370 println!(" Human readable: Yes");
371 println!(" Self-describing: Yes");
372 println!(" Cross-platform: Yes");
373 println!(" Compression ratio: Variable (depends on content)");
374
375 // Show sample JSON output
376 println!(" Sample output:");
377 for line in json_data.lines().take(3) {
378 println!(" {}", line);
379 }
380 if json_lines > 3 {
381 println!(" ... ({} more lines)", json_lines - 3);
382 }
383
384 // Binary characteristics
385 let binary_data = metrics.to_binary()?;
386
387 println!("\nBinary Format:");
388 println!(" Size: {} bytes", binary_data.len());
389 println!(" Human readable: No");
390 println!(" Self-describing: No (requires schema)");
391 println!(" Cross-platform: Yes (with proper endianness handling)");
392 println!(" Compression ratio: High (efficient encoding)");
393
394 // Show sample binary output (hex)
395 println!(" Sample output (first 32 bytes as hex):");
396 print!(" ");
397 for (i, byte) in binary_data.iter().take(32).enumerate() {
398 if i > 0 && i % 16 == 0 {
399 println!();
400 print!(" ");
401 }
402 print!("{:02x} ", byte);
403 }
404 if binary_data.len() > 32 {
405 println!("\n ... ({} more bytes)", binary_data.len() - 32);
406 } else {
407 println!();
408 }
409
410 // Verify roundtrip for both formats
411 let json_parsed = PerformanceMetrics::from_json(&json_data)?;
412 let binary_parsed = PerformanceMetrics::from_binary(&binary_data)?;
413
414 assert_eq!(metrics, json_parsed);
415 assert_eq!(metrics, binary_parsed);
416 println!("\nRoundtrip verification: PASSED");
417
418 Ok(())
419}
420
421/// Demonstrate size comparisons across different data types
422fn demonstrate_size_comparisons() -> Result<(), Box<dyn std::error::Error>> {
423 println!("\n--- Size Comparison Analysis ---");
424
425 // Test 1: Small configuration data (typical JSON use case)
426 let mut db_settings = HashMap::new();
427 db_settings.insert("host".to_string(), "localhost".to_string());
428 db_settings.insert("port".to_string(), "5432".to_string());
429 db_settings.insert("database".to_string(), "myapp".to_string());
430
431 let config = Configuration {
432 version: "1.2.3".to_string(),
433 debug_enabled: true,
434 log_level: "info".to_string(),
435 database_settings: db_settings,
436 feature_flags_enabled: true,
437 max_connections: 100.0,
438 timeout_seconds: 30.0,
439 };
440
441 // Test 2: Large numeric dataset (typical binary use case)
442 let large_dataset = LargeDataset {
443 name: "ML Training Data".to_string(),
444 values: (0..1000).map(|i| i as f32 * 0.1).collect(),
445 labels: (0..1000).map(|i| format!("label_{}", i)).collect(),
446 feature_count: 100,
447 feature_dimension: 50,
448 timestamp_count: 1000,
449 metadata: HashMap::new(),
450 };
451
452 println!("Size comparison results:");
453
454 // Configuration comparison
455 let config_json = config.to_json()?;
456 let config_binary = config.to_binary()?;
457
458 println!("\nConfiguration Data (small, text-heavy):");
459 println!(" JSON: {} bytes", config_json.len());
460 println!(" Binary: {} bytes", config_binary.len());
461 println!(
462 " Ratio (JSON/Binary): {:.2}x",
463 config_json.len() as f64 / config_binary.len() as f64
464 );
465 println!(" Recommendation: JSON (human readable, small size difference)");
466
467 // Large dataset comparison
468 let dataset_json = large_dataset.to_json()?;
469 let dataset_binary = large_dataset.to_binary()?;
470
471 println!("\nLarge Numeric Dataset (1000 values, 100x50 matrix):");
472 println!(
473 " JSON: {} bytes ({:.1} KB)",
474 dataset_json.len(),
475 dataset_json.len() as f64 / 1024.0
476 );
477 println!(
478 " Binary: {} bytes ({:.1} KB)",
479 dataset_binary.len(),
480 dataset_binary.len() as f64 / 1024.0
481 );
482 println!(
483 " Ratio (JSON/Binary): {:.2}x",
484 dataset_json.len() as f64 / dataset_binary.len() as f64
485 );
486 if dataset_json.len() > dataset_binary.len() {
487 println!(
488 " Space saved with binary: {} bytes ({:.1} KB)",
489 dataset_json.len() - dataset_binary.len(),
490 (dataset_json.len() - dataset_binary.len()) as f64 / 1024.0
491 );
492 println!(" Recommendation: Binary (significant size reduction)");
493 } else {
494 println!(
495 " Binary overhead: {} bytes ({:.1} KB)",
496 dataset_binary.len() - dataset_json.len(),
497 (dataset_binary.len() - dataset_json.len()) as f64 / 1024.0
498 );
499 println!(" Recommendation: JSON (binary overhead not justified for this size)");
500 }
501
502 // Content analysis
503 println!("\nContent Type Analysis:");
504
505 // Analyze JSON content patterns
506 let json_numbers = dataset_json.matches(char::is_numeric).count();
507 let json_brackets = dataset_json.matches('[').count() + dataset_json.matches(']').count();
508 let json_quotes = dataset_json.matches('"').count();
509
510 println!(" JSON overhead sources:");
511 println!(" Numeric characters: ~{}", json_numbers);
512 println!(" Brackets and commas: ~{}", json_brackets);
513 println!(" Quote marks: {}", json_quotes);
514 println!(" Formatting/whitespace: Varies");
515
516 println!(" Binary advantages:");
517 println!(" Direct numeric encoding: 4-8 bytes per number");
518 println!(" No formatting overhead: Zero bytes");
519 println!(" Efficient length encoding: Minimal bytes");
520
521 Ok(())
522}
523
524/// Demonstrate performance benchmarks
525fn demonstrate_performance_benchmarks() -> Result<(), Box<dyn std::error::Error>> {
526 println!("\n--- Performance Benchmark Analysis ---");
527
528 // Create test data of varying sizes
529 let small_config = Configuration {
530 version: "1.0.0".to_string(),
531 debug_enabled: false,
532 log_level: "warn".to_string(),
533 database_settings: HashMap::new(),
534 feature_flags_enabled: false,
535 max_connections: 100.0,
536 timeout_seconds: 30.0,
537 };
538
539 let large_dataset = LargeDataset {
540 name: "Large Dataset".to_string(),
541 values: (0..5000).map(|i| i as f32 * 0.001).collect(),
542 labels: (0..5000).map(|i| format!("large_item_{}", i)).collect(),
543 feature_count: 200,
544 feature_dimension: 25,
545 timestamp_count: 5000,
546 metadata: HashMap::new(),
547 };
548
549 println!("Performance benchmark results:");
550
551 // Benchmark each dataset (avoiding trait objects due to object safety)
552 let dataset_names = ["Small Config", "Large Dataset"];
553
554 for (i, name) in dataset_names.iter().enumerate() {
555 let mut comparison = FormatComparison::new(name.to_string());
556
557 // JSON serialization benchmark
558 let start = Instant::now();
559 let json_data = match i {
560 0 => small_config.to_json()?,
561 _ => large_dataset.to_json()?,
562 };
563 comparison.json_serialize_micros = start.elapsed().as_micros() as u64;
564 comparison.json_size_bytes = json_data.len() as u64;
565
566 // JSON deserialization benchmark
567 if *name == "Small Config" {
568 let start = Instant::now();
569 let _parsed = Configuration::from_json(&json_data)?;
570 comparison.json_deserialize_micros = start.elapsed().as_micros() as u64;
571 } else {
572 let start = Instant::now();
573 let _parsed = LargeDataset::from_json(&json_data)?;
574 comparison.json_deserialize_micros = start.elapsed().as_micros() as u64;
575 }
576
577 // Binary serialization benchmark
578 let start = Instant::now();
579 let binary_data = match i {
580 0 => small_config.to_binary()?,
581 _ => large_dataset.to_binary()?,
582 };
583 comparison.binary_serialize_micros = start.elapsed().as_micros() as u64;
584 comparison.binary_size_bytes = binary_data.len() as u64;
585
586 // Binary deserialization benchmark
587 if *name == "Small Config" {
588 let start = Instant::now();
589 let _parsed = Configuration::from_binary(&binary_data)?;
590 comparison.binary_deserialize_micros = start.elapsed().as_micros() as u64;
591 } else {
592 let start = Instant::now();
593 let _parsed = LargeDataset::from_binary(&binary_data)?;
594 comparison.binary_deserialize_micros = start.elapsed().as_micros() as u64;
595 }
596
597 // Calculate ratios
598 comparison.calculate_ratios();
599
600 // Display results
601 println!("\n{}:", name);
602 println!(
603 " Size - JSON: {} bytes, Binary: {} bytes (ratio: {:.2}x)",
604 comparison.json_size_bytes, comparison.binary_size_bytes, comparison.size_ratio
605 );
606 println!(
607 " Serialize - JSON: {}μs, Binary: {}μs (binary relative speed: {:.2}x)",
608 comparison.json_serialize_micros,
609 comparison.binary_serialize_micros,
610 comparison.serialize_speed_ratio
611 );
612 println!(
613 " Deserialize - JSON: {}μs, Binary: {}μs (binary relative speed: {:.2}x)",
614 comparison.json_deserialize_micros,
615 comparison.binary_deserialize_micros,
616 comparison.deserialize_speed_ratio
617 );
618 }
619
620 println!("\nPerformance Summary:");
621 println!(" - Binary format consistently uses less storage space");
622 println!(" - Performance differences vary by data type and size");
623 println!(" - Larger datasets show more significant binary advantages");
624 println!(" - JSON parsing overhead increases with structure complexity");
625
626 Ok(())
627 }
fn from_binary(data: &[u8]) -> SerializationResult<Self>
Creates the struct from binary data
Deserializes binary data into a new instance of the struct. The binary data should contain all required fields in the expected format.
§Arguments
data - Binary data containing the struct data
§Returns
Ok(Self) on successful deserialization
Err(SerializationError) if binary parsing or deserialization fails
§Examples
Creates struct from binary data with proper parsing and validation.
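The crate's wire format is not shown here, but the general shape of a length-prefixed, little-endian decoder like the one `from_binary` implies can be sketched in standalone Rust. The `Point` struct, the field layout (little-endian `u32` values, length-prefixed UTF-8 strings), and the helper names below are illustrative assumptions, not the crate's actual encoding:

```rust
use std::convert::TryInto;

// Hypothetical struct standing in for a StructSerializable type.
struct Point {
    id: u32,
    label: String,
}

// Encode: little-endian u32, then a u32 byte length followed by UTF-8 bytes.
fn to_binary(p: &Point) -> Vec<u8> {
    let mut out = Vec::new();
    out.extend_from_slice(&p.id.to_le_bytes());
    out.extend_from_slice(&(p.label.len() as u32).to_le_bytes());
    out.extend_from_slice(p.label.as_bytes());
    out
}

// Decode: read fields back in the same order, validating bounds and UTF-8.
fn from_binary(data: &[u8]) -> Option<Point> {
    let id = u32::from_le_bytes(data.get(0..4)?.try_into().ok()?);
    let len = u32::from_le_bytes(data.get(4..8)?.try_into().ok()?) as usize;
    let label = String::from_utf8(data.get(8..8 + len)?.to_vec()).ok()?;
    Some(Point { id, label })
}

fn main() {
    let p = Point { id: 7, label: "alpha".to_string() };
    let bytes = to_binary(&p);
    let back = from_binary(&bytes).expect("roundtrip failed");
    assert_eq!(back.id, 7);
    assert_eq!(back.label, "alpha");
    println!("roundtrip ok: {} bytes", bytes.len());
}
```

This illustrates why the binary format is described above as not self-describing: the reader must already know the field order and types to interpret the bytes.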
Examples found in repository
91 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
92 // Try JSON object first
93 if let Ok(json_data) = value.as_json_object() {
94 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
95 field: field_name.to_string(),
96 message: format!("Failed to deserialize UserProfile from JSON: {}", e),
97 });
98 }
99
100 // Try binary object
101 if let Ok(binary_data) = value.as_binary_object() {
102 return Self::from_binary(binary_data).map_err(|e| {
103 SerializationError::ValidationFailed {
104 field: field_name.to_string(),
105 message: format!("Failed to deserialize UserProfile from binary: {}", e),
106 }
107 });
108 }
109
110 Err(SerializationError::ValidationFailed {
111 field: field_name.to_string(),
112 message: format!(
113 "Expected JsonObject or BinaryObject for UserProfile, found {}",
114 value.type_name()
115 ),
116 })
117 }
118}
119
120/// Application settings struct with optional fields and collections
121#[derive(Debug, Clone, PartialEq)]
122pub struct AppSettings {
123 pub app_name: String,
124 pub version: String,
125 pub debug_mode: bool,
126 pub max_connections: u32,
127 pub timeout_seconds: f32,
128 pub features: Vec<String>,
129 pub environment_vars: HashMap<String, String>,
130 pub optional_database_url: Option<String>,
131}
132
133impl StructSerializable for AppSettings {
134 fn to_serializer(&self) -> StructSerializer {
135 StructSerializer::new()
136 .field("app_name", &self.app_name)
137 .field("version", &self.version)
138 .field("debug_mode", &self.debug_mode)
139 .field("max_connections", &self.max_connections)
140 .field("timeout_seconds", &self.timeout_seconds)
141 .field("features", &self.features)
142 .field("environment_vars", &self.environment_vars)
143 .field("optional_database_url", &self.optional_database_url)
144 }
145
146 fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
147 let app_name = deserializer.field("app_name")?;
148 let version = deserializer.field("version")?;
149 let debug_mode = deserializer.field("debug_mode")?;
150 let max_connections = deserializer.field("max_connections")?;
151 let timeout_seconds = deserializer.field("timeout_seconds")?;
152 let features = deserializer.field("features")?;
153 let environment_vars = deserializer.field("environment_vars")?;
154 let optional_database_url = deserializer.field("optional_database_url")?;
155
156 Ok(AppSettings {
157 app_name,
158 version,
159 debug_mode,
160 max_connections,
161 timeout_seconds,
162 features,
163 environment_vars,
164 optional_database_url,
165 })
166 }
167}
168
169impl ToFieldValue for AppSettings {
170 fn to_field_value(&self) -> FieldValue {
171 // Convert to JSON and then parse as FieldValue for nested object handling
172 match self.to_json() {
173 Ok(json_str) => FieldValue::from_json_object(json_str),
174 Err(_) => FieldValue::from_string("serialization_error".to_string()),
175 }
176 }
177}
178
179impl FromFieldValue for AppSettings {
180 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
181 // Try JSON object first
182 if let Ok(json_data) = value.as_json_object() {
183 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
184 field: field_name.to_string(),
185 message: format!("Failed to deserialize AppSettings from JSON: {}", e),
186 });
187 }
188
189 // Try binary object
190 if let Ok(binary_data) = value.as_binary_object() {
191 return Self::from_binary(binary_data).map_err(|e| {
192 SerializationError::ValidationFailed {
193 field: field_name.to_string(),
194 message: format!("Failed to deserialize AppSettings from binary: {}", e),
195 }
196 });
197 }
198
199 Err(SerializationError::ValidationFailed {
200 field: field_name.to_string(),
201 message: format!(
202 "Expected JsonObject or BinaryObject for AppSettings, found {}",
203 value.type_name()
204 ),
205 })
206 }
207}
208
209fn main() -> Result<(), Box<dyn std::error::Error>> {
210 println!("=== Basic Struct Serialization Example ===\n");
211
212 demonstrate_user_profile_serialization()?;
213 demonstrate_app_settings_serialization()?;
214 demonstrate_format_comparison()?;
215 demonstrate_roundtrip_verification()?;
216 demonstrate_field_access_patterns()?;
217 cleanup_temp_files()?;
218
219 println!("\n=== Example completed successfully! ===");
220 Ok(())
221}
222
223/// Demonstrate basic struct serialization with simple field types
224fn demonstrate_user_profile_serialization() -> Result<(), Box<dyn std::error::Error>> {
225 println!("--- User Profile Serialization ---");
226
227 // Create a user profile with various field types
228 let user = UserProfile {
229 id: 12345,
230 username: "alice_cooper".to_string(),
231 email: "alice@example.com".to_string(),
232 age: 28,
233 is_active: true,
234 score: 95.7,
235 };
236
237 println!("Original user profile:");
238 println!(" ID: {}", user.id);
239 println!(" Username: {}", user.username);
240 println!(" Email: {}", user.email);
241 println!(" Age: {}", user.age);
242 println!(" Active: {}", user.is_active);
243 println!(" Score: {}", user.score);
244
245 // Serialize to JSON
246 let json_data = user.to_json()?;
247 println!("\nSerialized to JSON:");
248 println!("{}", json_data);
249
250 // Save to JSON file
251 user.save_json("temp_user_profile.json")?;
252 println!("Saved to file: temp_user_profile.json");
253
254 // Load from JSON file
255 let loaded_user = UserProfile::load_json("temp_user_profile.json")?;
256 println!("\nLoaded user profile:");
257 println!(" ID: {}", loaded_user.id);
258 println!(" Username: {}", loaded_user.username);
259 println!(" Email: {}", loaded_user.email);
260 println!(" Age: {}", loaded_user.age);
261 println!(" Active: {}", loaded_user.is_active);
262 println!(" Score: {}", loaded_user.score);
263
264 // Verify data integrity
265 assert_eq!(user, loaded_user);
266 println!("Data integrity verification: PASSED");
267
268 Ok(())
269}
270
271/// Demonstrate serialization with collections and optional fields
272fn demonstrate_app_settings_serialization() -> Result<(), Box<dyn std::error::Error>> {
273 println!("\n--- App Settings Serialization ---");
274
275 // Create app settings with collections and optional fields
276 let mut env_vars = HashMap::new();
277 env_vars.insert("LOG_LEVEL".to_string(), "info".to_string());
278 env_vars.insert("PORT".to_string(), "8080".to_string());
279 env_vars.insert("HOST".to_string(), "localhost".to_string());
280
281 let settings = AppSettings {
282 app_name: "Train Station Example".to_string(),
283 version: "1.0.0".to_string(),
284 debug_mode: true,
285 max_connections: 100,
286 timeout_seconds: 30.5,
287 features: vec![
288 "authentication".to_string(),
289 "logging".to_string(),
290 "metrics".to_string(),
291 ],
292 environment_vars: env_vars,
293 optional_database_url: Some("postgresql://localhost:5432/mydb".to_string()),
294 };
295
296 println!("Original app settings:");
297 println!(" App Name: {}", settings.app_name);
298 println!(" Version: {}", settings.version);
299 println!(" Debug Mode: {}", settings.debug_mode);
300 println!(" Max Connections: {}", settings.max_connections);
301 println!(" Timeout: {} seconds", settings.timeout_seconds);
302 println!(" Features: {:?}", settings.features);
303 println!(" Environment Variables: {:?}", settings.environment_vars);
304 println!(" Database URL: {:?}", settings.optional_database_url);
305
306 // Serialize to binary format for efficient storage
307 let binary_data = settings.to_binary()?;
308 println!("\nSerialized to binary: {} bytes", binary_data.len());
309
310 // Save to binary file
311 settings.save_binary("temp_app_settings.bin")?;
312 println!("Saved to file: temp_app_settings.bin");
313
314 // Load from binary file
315 let loaded_settings = AppSettings::load_binary("temp_app_settings.bin")?;
316 println!("\nLoaded app settings:");
317 println!(" App Name: {}", loaded_settings.app_name);
318 println!(" Version: {}", loaded_settings.version);
319 println!(" Debug Mode: {}", loaded_settings.debug_mode);
320 println!(" Features count: {}", loaded_settings.features.len());
321 println!(
322 " Environment variables count: {}",
323 loaded_settings.environment_vars.len()
324 );
325
326 // Verify data integrity
327 assert_eq!(settings, loaded_settings);
328 println!("Data integrity verification: PASSED");
329
330 Ok(())
331}
332
333/// Demonstrate format comparison between JSON and binary
334fn demonstrate_format_comparison() -> Result<(), Box<dyn std::error::Error>> {
335 println!("\n--- Format Comparison ---");
336
337 let user = UserProfile {
338 id: 98765,
339 username: "bob_builder".to_string(),
340 email: "bob@construction.com".to_string(),
341 age: 35,
342 is_active: false,
343 score: 87.2,
344 };
345
346 // Save in both formats
347 user.save_json("temp_format_comparison.json")?;
348 user.save_binary("temp_format_comparison.bin")?;
349
350 // Compare file sizes
351 let json_size = fs::metadata("temp_format_comparison.json")?.len();
352 let binary_size = fs::metadata("temp_format_comparison.bin")?.len();
353
354 println!("Format comparison for UserProfile:");
355 println!(" JSON file size: {} bytes", json_size);
356 println!(" Binary file size: {} bytes", binary_size);
357 println!(
358 " Size ratio (JSON/Binary): {:.2}x",
359 json_size as f64 / binary_size as f64
360 );
361
362 // Demonstrate readability
363 let json_content = fs::read_to_string("temp_format_comparison.json")?;
364 println!("\nJSON format (human-readable):");
365 println!("{}", json_content);
366
367 println!("\nBinary format (first 32 bytes as hex):");
368 let binary_content = fs::read("temp_format_comparison.bin")?;
369 for (i, byte) in binary_content.iter().take(32).enumerate() {
370 if i % 16 == 0 && i > 0 {
371 println!();
372 }
373 print!("{:02x} ", byte);
374 }
375 println!("\n... ({} total bytes)", binary_content.len());
376
377 // Load and verify both formats produce identical results
378 let json_loaded = UserProfile::load_json("temp_format_comparison.json")?;
379 let binary_loaded = UserProfile::load_binary("temp_format_comparison.bin")?;
380
381 assert_eq!(json_loaded, binary_loaded);
382 println!("\nFormat consistency verification: PASSED");
383
384 Ok(())
385}
386
387/// Demonstrate roundtrip verification with multiple data variations
388fn demonstrate_roundtrip_verification() -> Result<(), Box<dyn std::error::Error>> {
389 println!("\n--- Roundtrip Verification ---");
390
391 // Test various data patterns
392 let test_users = [
393 UserProfile {
394 id: 0,
395 username: "".to_string(),
396 email: "empty@test.com".to_string(),
397 age: 0,
398 is_active: false,
399 score: 0.0,
400 },
401 UserProfile {
402 id: u32::MAX,
403 username: "maximal_user_with_very_long_name_123456789".to_string(),
404 email: "test@verylongdomainname.example.org".to_string(),
405 age: i32::MAX,
406 is_active: true,
407 score: 999999.5,
408 },
409 UserProfile {
410 id: 42,
411 username: "unicode_tëst_🦀".to_string(),
412 email: "unicode@tëst.com".to_string(),
413 age: 25,
414 is_active: true,
415 score: -123.456,
416 },
417 ];
418
419 println!(
420 "Testing roundtrip serialization with {} variations:",
421 test_users.len()
422 );
423
424 for (i, user) in test_users.iter().enumerate() {
425 println!(
426 " Test case {}: ID={}, Username='{}'",
427 i + 1,
428 user.id,
429 user.username
430 );
431
432 // JSON roundtrip
433 let json_data = user.to_json()?;
434 let json_parsed = UserProfile::from_json(&json_data)?;
435 assert_eq!(*user, json_parsed);
436
437 // Binary roundtrip
438 let binary_data = user.to_binary()?;
439 let binary_parsed = UserProfile::from_binary(&binary_data)?;
440 assert_eq!(*user, binary_parsed);
441
442 println!(" JSON roundtrip: PASSED");
443 println!(" Binary roundtrip: PASSED");
444 }
445
446 println!("All roundtrip tests: PASSED");
447
448 Ok(())
449 }
More examples
83 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
84 // Try JSON object first
85 if let Ok(json_data) = value.as_json_object() {
86 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
87 field: field_name.to_string(),
88 message: format!("Failed to deserialize ContactInfo from JSON: {}", e),
89 });
90 }
91
92 // Try binary object
93 if let Ok(binary_data) = value.as_binary_object() {
94 return Self::from_binary(binary_data).map_err(|e| {
95 SerializationError::ValidationFailed {
96 field: field_name.to_string(),
97 message: format!("Failed to deserialize ContactInfo from binary: {}", e),
98 }
99 });
100 }
101
102 Err(SerializationError::ValidationFailed {
103 field: field_name.to_string(),
104 message: format!(
105 "Expected JsonObject or BinaryObject for ContactInfo, found {}",
106 value.type_name()
107 ),
108 })
109 }
110}
111
112/// Address struct
113#[derive(Debug, Clone, PartialEq)]
114pub struct Address {
115 pub street: String,
116 pub city: String,
117 pub state: String,
118 pub postal_code: String,
119 pub country: String,
120}
121
122impl StructSerializable for Address {
123 fn to_serializer(&self) -> StructSerializer {
124 StructSerializer::new()
125 .field("street", &self.street)
126 .field("city", &self.city)
127 .field("state", &self.state)
128 .field("postal_code", &self.postal_code)
129 .field("country", &self.country)
130 }
131
132 fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
133 let street = deserializer.field("street")?;
134 let city = deserializer.field("city")?;
135 let state = deserializer.field("state")?;
136 let postal_code = deserializer.field("postal_code")?;
137 let country = deserializer.field("country")?;
138
139 Ok(Address {
140 street,
141 city,
142 state,
143 postal_code,
144 country,
145 })
146 }
147}
148
149impl ToFieldValue for Address {
150 fn to_field_value(&self) -> FieldValue {
151 match self.to_json() {
152 Ok(json_str) => FieldValue::from_json_object(json_str),
153 Err(_) => FieldValue::from_string("serialization_error".to_string()),
154 }
155 }
156}
157
158impl FromFieldValue for Address {
159 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
160 // Try JSON object first
161 if let Ok(json_data) = value.as_json_object() {
162 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
163 field: field_name.to_string(),
164 message: format!("Failed to deserialize Address from JSON: {}", e),
165 });
166 }
167
168 // Try binary object
169 if let Ok(binary_data) = value.as_binary_object() {
170 return Self::from_binary(binary_data).map_err(|e| {
171 SerializationError::ValidationFailed {
172 field: field_name.to_string(),
173 message: format!("Failed to deserialize Address from binary: {}", e),
174 }
175 });
176 }
177
178 Err(SerializationError::ValidationFailed {
179 field: field_name.to_string(),
180 message: format!(
181 "Expected JsonObject or BinaryObject for Address, found {}",
182 value.type_name()
183 ),
184 })
185 }
186}
187
188/// Project information struct
189#[derive(Debug, Clone, PartialEq)]
190pub struct Project {
191 pub name: String,
192 pub description: String,
193 pub status: ProjectStatus,
194 pub budget: f64,
195 pub team_members: Vec<String>,
196 pub milestones: Vec<Milestone>,
197 pub metadata: HashMap<String, String>,
198}
199
200impl StructSerializable for Project {
201 fn to_serializer(&self) -> StructSerializer {
202 StructSerializer::new()
203 .field("name", &self.name)
204 .field("description", &self.description)
205 .field("status", &self.status)
206 .field("budget", &self.budget)
207 .field("team_members", &self.team_members)
208 .field("milestones", &self.milestones)
209 .field("metadata", &self.metadata)
210 }
211
212 fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
213 let name = deserializer.field("name")?;
214 let description = deserializer.field("description")?;
215 let status = deserializer.field("status")?;
216 let budget = deserializer.field("budget")?;
217 let team_members = deserializer.field("team_members")?;
218 let milestones = deserializer.field("milestones")?;
219 let metadata = deserializer.field("metadata")?;
220
221 Ok(Project {
222 name,
223 description,
224 status,
225 budget,
226 team_members,
227 milestones,
228 metadata,
229 })
230 }
231}
232
233impl ToFieldValue for Project {
234 fn to_field_value(&self) -> FieldValue {
235 match self.to_json() {
236 Ok(json_str) => FieldValue::from_json_object(json_str),
237 Err(_) => FieldValue::from_string("serialization_error".to_string()),
238 }
239 }
240}
241
242impl FromFieldValue for Project {
243 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
244 // Try JSON object first
245 if let Ok(json_data) = value.as_json_object() {
246 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
247 field: field_name.to_string(),
248 message: format!("Failed to deserialize Project from JSON: {}", e),
249 });
250 }
251
252 // Try binary object
253 if let Ok(binary_data) = value.as_binary_object() {
254 return Self::from_binary(binary_data).map_err(|e| {
255 SerializationError::ValidationFailed {
256 field: field_name.to_string(),
257 message: format!("Failed to deserialize Project from binary: {}", e),
258 }
259 });
260 }
261
262 Err(SerializationError::ValidationFailed {
263 field: field_name.to_string(),
264 message: format!(
265 "Expected JsonObject or BinaryObject for Project, found {}",
266 value.type_name()
267 ),
268 })
269 }
270}
271
272/// Project status enumeration
273#[derive(Debug, Clone, PartialEq)]
274pub enum ProjectStatus {
275 Planning,
276 InProgress,
277 OnHold,
278 Completed,
279 Cancelled,
280}
281
282impl ToFieldValue for ProjectStatus {
283 fn to_field_value(&self) -> FieldValue {
284 let status_str = match self {
285 ProjectStatus::Planning => "planning",
286 ProjectStatus::InProgress => "in_progress",
287 ProjectStatus::OnHold => "on_hold",
288 ProjectStatus::Completed => "completed",
289 ProjectStatus::Cancelled => "cancelled",
290 };
291 FieldValue::from_string(status_str.to_string())
292 }
293}
294
295impl FromFieldValue for ProjectStatus {
296 fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
297 match value {
298 FieldValue::String(s) => match s.as_str() {
299 "planning" => Ok(ProjectStatus::Planning),
300 "in_progress" => Ok(ProjectStatus::InProgress),
301 "on_hold" => Ok(ProjectStatus::OnHold),
302 "completed" => Ok(ProjectStatus::Completed),
303 "cancelled" => Ok(ProjectStatus::Cancelled),
304 _ => Err(SerializationError::ValidationFailed {
305 field: field_name.to_string(),
306 message: format!("Unknown project status: {}", s),
307 }),
308 },
309 _ => Err(SerializationError::ValidationFailed {
310 field: field_name.to_string(),
311 message: format!(
312 "Expected String for ProjectStatus, found {}",
313 value.type_name()
314 ),
315 }),
316 }
317 }
318}

/// Project milestone struct
#[derive(Debug, Clone, PartialEq)]
pub struct Milestone {
    pub name: String,
    pub description: String,
    pub due_date: String, // Simplified as a string for this example
    pub is_completed: bool,
    pub progress_percentage: f32,
    pub dependencies: Vec<String>,
}

impl StructSerializable for Milestone {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("name", &self.name)
            .field("description", &self.description)
            .field("due_date", &self.due_date)
            .field("is_completed", &self.is_completed)
            .field("progress_percentage", &self.progress_percentage)
            .field("dependencies", &self.dependencies)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        let name = deserializer.field("name")?;
        let description = deserializer.field("description")?;
        let due_date = deserializer.field("due_date")?;
        let is_completed = deserializer.field("is_completed")?;
        let progress_percentage = deserializer.field("progress_percentage")?;
        let dependencies = deserializer.field("dependencies")?;

        Ok(Milestone {
            name,
            description,
            due_date,
            is_completed,
            progress_percentage,
            dependencies,
        })
    }
}

impl ToFieldValue for Milestone {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for Milestone {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize Milestone from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize Milestone from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for Milestone, found {}",
                value.type_name()
            ),
        })
    }
}
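// Illustrative sketch of the dispatch pattern every FromFieldValue impl above
// follows: prefer the JSON representation, fall back to binary, and only error
// when the value is neither. Everything in `demo_fallback` is a stand-in, not
// crate API.
#[cfg(test)]
mod demo_fallback {
    pub enum Payload {
        Json(String),
        Binary(Vec<u8>),
        Other(&'static str),
    }

    pub fn decode(p: &Payload) -> Result<String, String> {
        match p {
            Payload::Json(s) => Ok(format!("json:{}", s)),
            Payload::Binary(b) => Ok(format!("binary:{} bytes", b.len())),
            Payload::Other(type_name) => Err(format!(
                "Expected JsonObject or BinaryObject, found {}",
                type_name
            )),
        }
    }

    #[test]
    fn prefers_json_then_binary_then_errors() {
        assert_eq!(decode(&Payload::Json("{}".to_string())).unwrap(), "json:{}");
        assert_eq!(
            decode(&Payload::Binary(vec![1, 2, 3])).unwrap(),
            "binary:3 bytes"
        );
        assert!(decode(&Payload::Other("String")).is_err());
    }
}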

/// Company struct with basic collections and nesting
#[derive(Debug, Clone, PartialEq)]
pub struct Company {
    pub name: String,
    pub founded_year: i32,
    pub headquarters_city: String,
    pub headquarters_state: String,
    pub employee_count: usize,
    pub department_names: Vec<String>,
    pub active_project_names: Vec<String>,
    pub company_metadata: HashMap<String, String>,
}

impl StructSerializable for Company {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("name", &self.name)
            .field("founded_year", &self.founded_year)
            .field("headquarters_city", &self.headquarters_city)
            .field("headquarters_state", &self.headquarters_state)
            .field("employee_count", &self.employee_count)
            .field("department_names", &self.department_names)
            .field("active_project_names", &self.active_project_names)
            .field("company_metadata", &self.company_metadata)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        let name = deserializer.field("name")?;
        let founded_year = deserializer.field("founded_year")?;
        let headquarters_city = deserializer.field("headquarters_city")?;
        let headquarters_state = deserializer.field("headquarters_state")?;
        let employee_count = deserializer.field("employee_count")?;
        let department_names = deserializer.field("department_names")?;
        let active_project_names = deserializer.field("active_project_names")?;
        let company_metadata = deserializer.field("company_metadata")?;

        Ok(Company {
            name,
            founded_year,
            headquarters_city,
            headquarters_state,
            employee_count,
            department_names,
            active_project_names,
            company_metadata,
        })
    }
}

impl ToFieldValue for Company {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for Company {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize Company from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize Company from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for Company, found {}",
                value.type_name()
            ),
        })
    }
}

/// Department struct
#[derive(Debug, Clone, PartialEq)]
pub struct Department {
    pub name: String,
    pub manager: String,
    pub employee_count: u32,
    pub budget: f64,
    pub office_locations: Vec<Address>,
}

impl StructSerializable for Department {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("name", &self.name)
            .field("manager", &self.manager)
            .field("employee_count", &self.employee_count)
            .field("budget", &self.budget)
            .field("office_locations", &self.office_locations)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        let name = deserializer.field("name")?;
        let manager = deserializer.field("manager")?;
        let employee_count = deserializer.field("employee_count")?;
        let budget = deserializer.field("budget")?;
        let office_locations = deserializer.field("office_locations")?;

        Ok(Department {
            name,
            manager,
            employee_count,
            budget,
            office_locations,
        })
    }
}

impl ToFieldValue for Department {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for Department {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize Department from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize Department from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for Department, found {}",
                value.type_name()
            ),
        })
    }
}
fn main() -> Result<(), Box<dyn std::error::Error>> {
    println!("=== Nested Structures Serialization Example ===\n");

    demonstrate_nested_struct_creation()?;
    demonstrate_deep_serialization()?;
    demonstrate_collection_nesting()?;
    demonstrate_partial_loading()?;
    demonstrate_performance_analysis()?;
    cleanup_temp_files()?;

    println!("\n=== Example completed successfully! ===");
    Ok(())
}
/// Demonstrate creating complex nested structures
fn demonstrate_nested_struct_creation() -> Result<(), Box<dyn std::error::Error>> {
    println!("--- Nested Structure Creation ---");

    // Create nested address and contact info
    let headquarters = Address {
        street: "123 Innovation Drive".to_string(),
        city: "Tech City".to_string(),
        state: "CA".to_string(),
        postal_code: "94000".to_string(),
        country: "USA".to_string(),
    };

    let mut social_media = HashMap::new();
    social_media.insert("twitter".to_string(), "@techcorp".to_string());
    social_media.insert("linkedin".to_string(), "techcorp-inc".to_string());

    let contact_info = ContactInfo {
        email: "info@techcorp.com".to_string(),
        phone: Some("+1-555-0123".to_string()),
        address_city: headquarters.city.clone(),
        address_state: headquarters.state.clone(),
        social_media,
    };

    // Create departments with nested office locations
    let engineering_office = Address {
        street: "456 Developer Lane".to_string(),
        city: "Code City".to_string(),
        state: "CA".to_string(),
        postal_code: "94001".to_string(),
        country: "USA".to_string(),
    };

    let departments = [
        Department {
            name: "Engineering".to_string(),
            manager: "Alice Johnson".to_string(),
            employee_count: 50,
            budget: 2500000.0,
            office_locations: vec![engineering_office, headquarters.clone()],
        },
        Department {
            name: "Marketing".to_string(),
            manager: "Bob Smith".to_string(),
            employee_count: 15,
            budget: 800000.0,
            office_locations: vec![headquarters.clone()],
        },
    ];

    // Create projects with milestones
    let milestones = vec![
        Milestone {
            name: "Requirements Analysis".to_string(),
            description: "Complete system requirements documentation".to_string(),
            due_date: "2024-03-15".to_string(),
            is_completed: true,
            progress_percentage: 100.0,
            dependencies: vec![],
        },
        Milestone {
            name: "Architecture Design".to_string(),
            description: "Define system architecture and components".to_string(),
            due_date: "2024-04-01".to_string(),
            is_completed: false,
            progress_percentage: 75.0,
            dependencies: vec!["Requirements Analysis".to_string()],
        },
    ];

    let mut project_metadata = HashMap::new();
    project_metadata.insert("priority".to_string(), "high".to_string());
    project_metadata.insert("client".to_string(), "internal".to_string());

    let projects = [Project {
        name: "Train Station ML Platform".to_string(),
        description: "Next-generation machine learning infrastructure".to_string(),
        status: ProjectStatus::InProgress,
        budget: 1500000.0,
        team_members: vec![
            "Alice Johnson".to_string(),
            "Charlie Brown".to_string(),
            "Diana Prince".to_string(),
        ],
        milestones: milestones.clone(),
        metadata: project_metadata,
    }];

    // Create the complete company structure
    let mut company_metadata = HashMap::new();
    company_metadata.insert("industry".to_string(), "technology".to_string());
    company_metadata.insert("stock_symbol".to_string(), "TECH".to_string());

    let company = Company {
        name: "TechCorp Inc.".to_string(),
        founded_year: 2015,
        headquarters_city: headquarters.city.clone(),
        headquarters_state: headquarters.state.clone(),
        employee_count: 250,
        department_names: departments.iter().map(|d| d.name.clone()).collect(),
        active_project_names: projects.iter().map(|p| p.name.clone()).collect(),
        company_metadata,
    };

    println!("Created complex company structure:");
    println!("  Company: {}", company.name);
    println!("  Founded: {}", company.founded_year);
    println!(
        "  Headquarters: {}, {}",
        company.headquarters_city, company.headquarters_state
    );
    println!("  Employee Count: {}", company.employee_count);
    println!("  Departments: {}", company.department_names.len());
    println!("  Active Projects: {}", company.active_project_names.len());

    // Save the complete structure
    company.save_json("temp_nested_company.json")?;
    println!("Saved nested structure to: temp_nested_company.json");

    // Verify loading preserves all nested data
    let loaded_company = Company::load_json("temp_nested_company.json")?;
    assert_eq!(company, loaded_company);
    println!("Successfully verified Company roundtrip serialization");

    // Also demonstrate individual component serialization
    let address_json = headquarters.to_json()?;
    let loaded_address = Address::from_json(&address_json)?;
    assert_eq!(headquarters, loaded_address);
    println!("Successfully serialized/deserialized Address component");

    let contact_json = contact_info.to_json()?;
    let loaded_contact = ContactInfo::from_json(&contact_json)?;
    assert_eq!(contact_info, loaded_contact);
    println!("Successfully serialized/deserialized ContactInfo component");
    println!("Nested structure integrity: VERIFIED");

    Ok(())
}
/// Demonstrate deep serialization with complex nesting
fn demonstrate_deep_serialization() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Deep Serialization Analysis ---");

    let deep_milestone = Milestone {
        name: "Deep Milestone".to_string(),
        description: "Testing deep nesting serialization".to_string(),
        due_date: "2024-12-31".to_string(),
        is_completed: false,
        progress_percentage: 50.0,
        dependencies: vec!["Parent Task".to_string(), "Sibling Task".to_string()],
    };

    let deep_project = Project {
        name: "Deep Nesting Test".to_string(),
        description: "Project for testing serialization depth".to_string(),
        status: ProjectStatus::Planning,
        budget: 100000.0,
        team_members: vec!["Developer 1".to_string(), "Developer 2".to_string()],
        milestones: vec![deep_milestone],
        metadata: HashMap::new(),
    };

    // Analyze serialization output
    let json_output = deep_project.to_json()?;
    let binary_output = deep_project.to_binary()?;

    println!("Deep structure serialization analysis:");
    println!("  JSON size: {} bytes", json_output.len());
    println!("  Binary size: {} bytes", binary_output.len());
    println!("  Nesting levels: Project -> Milestone -> Dependencies");

    // Count nested objects in JSON (rough estimate)
    let object_count = json_output.matches('{').count();
    let array_count = json_output.matches('[').count();
    println!("  JSON objects: {}", object_count);
    println!("  JSON arrays: {}", array_count);

    // Verify deep roundtrip
    let json_parsed = Project::from_json(&json_output)?;
    let binary_parsed = Project::from_binary(&binary_output)?;

    assert_eq!(deep_project, json_parsed);
    assert_eq!(deep_project, binary_parsed);
    println!("Deep serialization roundtrip: VERIFIED");

    Ok(())
}
/// Demonstrate collection nesting patterns
fn demonstrate_collection_nesting() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Collection Nesting Patterns ---");

    // Create multiple departments with varying complexity
    let departments = vec![
        Department {
            name: "Research".to_string(),
            manager: "Dr. Science".to_string(),
            employee_count: 25,
            budget: 1200000.0,
            office_locations: vec![
                Address {
                    street: "1 Research Blvd".to_string(),
                    city: "Innovation Hub".to_string(),
                    state: "MA".to_string(),
                    postal_code: "02101".to_string(),
                    country: "USA".to_string(),
                },
                Address {
                    street: "2 Lab Street".to_string(),
                    city: "Tech Valley".to_string(),
                    state: "NY".to_string(),
                    postal_code: "12180".to_string(),
                    country: "USA".to_string(),
                },
            ],
        },
        Department {
            name: "Quality Assurance".to_string(),
            manager: "Test Master".to_string(),
            employee_count: 12,
            budget: 600000.0,
            office_locations: vec![], // Empty collection
        },
    ];

    println!("Collection nesting analysis:");
    println!("  Departments: {}", departments.len());

    let total_locations: usize = departments.iter().map(|d| d.office_locations.len()).sum();
    println!("  Total office locations: {}", total_locations);

    // Test serialization with mixed empty and populated collections.
    // Note: Vec<Department> doesn't implement StructSerializable directly,
    // so each department is serialized individually.
    let department_json_strings: Result<Vec<String>, _> =
        departments.iter().map(|dept| dept.to_json()).collect();
    let department_json_strings = department_json_strings?;

    // Deserialize each department back
    let parsed_departments: Result<Vec<Department>, _> = department_json_strings
        .iter()
        .map(|json_str| Department::from_json(json_str))
        .collect();
    let parsed_departments = parsed_departments?;

    assert_eq!(departments, parsed_departments);
    println!("Collection nesting serialization: VERIFIED");

    // Analyze collection patterns
    for (i, dept) in departments.iter().enumerate() {
        println!(
            "  Department {}: {} locations",
            i + 1,
            dept.office_locations.len()
        );
    }

    Ok(())
}
/// Demonstrate partial loading and field access
fn demonstrate_partial_loading() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Partial Loading and Field Access ---");

    // Create a simple project for analysis
    let project = Project {
        name: "Sample Project".to_string(),
        description: "For testing partial loading".to_string(),
        status: ProjectStatus::InProgress,
        budget: 50000.0,
        team_members: vec!["Alice".to_string(), "Bob".to_string()],
        milestones: vec![Milestone {
            name: "Phase 1".to_string(),
            description: "Initial phase".to_string(),
            due_date: "2024-06-01".to_string(),
            is_completed: true,
            progress_percentage: 100.0,
            dependencies: vec![],
        }],
        metadata: HashMap::new(),
    };

    // Convert to JSON and analyze structure
    println!("Project JSON structure analysis:");

    // Examine available fields by inspecting the JSON text
    let json_data = project.to_json()?;
    let field_count = json_data.matches(':').count();
    println!("  Estimated fields: {}", field_count);

    // Show top-level structure
    let lines: Vec<&str> = json_data.lines().take(10).collect();
    println!("  JSON structure preview:");
    for line in lines.iter().take(5) {
        if let Some(colon_pos) = line.find(':') {
            let field_name = line[..colon_pos].trim().trim_matches('"').trim();
            if !field_name.is_empty() {
                println!("    - {}", field_name);
            }
        }
    }

    // Demonstrate field type analysis
    println!("\nField type analysis:");
    println!("  name: String");
    println!("  status: Enum -> String");
    println!("  budget: f64 -> Number");
    println!("  team_members: Vec<String> -> Array");
    println!("  milestones: Vec<Milestone> -> Array of Objects");

    Ok(())
}
/// Demonstrate performance analysis for nested structures
fn demonstrate_performance_analysis() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Performance Analysis ---");

    // Create structures of varying complexity
    let simple_address = Address {
        street: "123 Main St".to_string(),
        city: "Anytown".to_string(),
        state: "ST".to_string(),
        postal_code: "12345".to_string(),
        country: "USA".to_string(),
    };

    let complex_department = Department {
        name: "Complex Department".to_string(),
        manager: "Manager Name".to_string(),
        employee_count: 100,
        budget: 5000000.0,
        office_locations: vec![simple_address.clone(); 10], // 10 identical addresses
    };

    let complex_project = Project {
        name: "Complex Project".to_string(),
        description: "Large project with many components".to_string(),
        status: ProjectStatus::InProgress,
        budget: 2000000.0,
        team_members: (1..=50).map(|i| format!("Team Member {}", i)).collect(),
        milestones: (1..=20)
            .map(|i| Milestone {
                name: format!("Milestone {}", i),
                description: format!("Description for milestone {}", i),
                due_date: "2024-12-31".to_string(),
                is_completed: i <= 10,
                progress_percentage: if i <= 10 { 100.0 } else { 50.0 },
                dependencies: if i > 1 {
                    vec![format!("Milestone {}", i - 1)]
                } else {
                    vec![]
                },
            })
            .collect(),
        metadata: HashMap::new(),
    };

    // Compare serialized output sizes
    println!("Performance comparison:");

    // Simple address
    let addr_json = simple_address.to_json()?;
    let addr_binary = simple_address.to_binary()?;
    println!("  Simple Address:");
    println!("    JSON: {} bytes", addr_json.len());
    println!("    Binary: {} bytes", addr_binary.len());

    // Complex department
    let dept_json = complex_department.to_json()?;
    let dept_binary = complex_department.to_binary()?;
    println!("  Complex Department (10 addresses):");
    println!("    JSON: {} bytes", dept_json.len());
    println!("    Binary: {} bytes", dept_binary.len());

    // Complex project
    let proj_json = complex_project.to_json()?;
    let proj_binary = complex_project.to_binary()?;
    println!("  Complex Project (50 members, 20 milestones):");
    println!("    JSON: {} bytes", proj_json.len());
    println!("    Binary: {} bytes", proj_binary.len());

    // Calculate size ratios
    let dept_ratio = dept_json.len() as f64 / dept_binary.len() as f64;
    let proj_ratio = proj_json.len() as f64 / proj_binary.len() as f64;

    println!("\nFormat efficiency (JSON/Binary ratio):");
    println!("  Department: {:.2}x", dept_ratio);
    println!("  Project: {:.2}x", proj_ratio);

    // Verify complex structure roundtrip
    let proj_json_parsed = Project::from_json(&proj_json)?;
    let proj_binary_parsed = Project::from_binary(&proj_binary)?;

    assert_eq!(complex_project, proj_json_parsed);
    assert_eq!(complex_project, proj_binary_parsed);
    println!("Complex structure roundtrip: VERIFIED");

    Ok(())
}

impl FromFieldValue for VersionedData {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize VersionedData from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize VersionedData from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for VersionedData, found {}",
                value.type_name()
            ),
        })
    }
}

/// Validated user input with constraints
#[derive(Debug, Clone, PartialEq)]
pub struct ValidatedUserInput {
    pub username: String,
    pub email: String,
    pub age: u16,
    pub preferences: HashMap<String, String>,
}

impl StructSerializable for ValidatedUserInput {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("username", &self.username)
            .field("email", &self.email)
            .field("age", &self.age)
            .field("preferences", &self.preferences)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        let username: String = deserializer.field("username")?;
        let email: String = deserializer.field("email")?;
        let age: u16 = deserializer.field("age")?;
        let preferences: HashMap<String, String> = deserializer.field("preferences")?;

        // Validate username
        if username.is_empty() || username.len() > 50 {
            return Err(SerializationError::ValidationFailed {
                field: "username".to_string(),
                message: "Username must be 1-50 characters long".to_string(),
            });
        }

        if !username
            .chars()
            .all(|c| c.is_alphanumeric() || c == '_' || c == '-')
        {
            return Err(SerializationError::ValidationFailed {
                field: "username".to_string(),
                message:
                    "Username can only contain alphanumeric characters, underscores, and hyphens"
                        .to_string(),
            });
        }

        // Validate email (basic check)
        if !email.contains('@') || !email.contains('.') || email.len() < 5 {
            return Err(SerializationError::ValidationFailed {
                field: "email".to_string(),
                message: "Invalid email format".to_string(),
            });
        }

        // Validate age
        if !(13..=120).contains(&age) {
            return Err(SerializationError::ValidationFailed {
                field: "age".to_string(),
                message: "Age must be between 13 and 120".to_string(),
            });
        }

        // Validate preferences
        if preferences.len() > 20 {
            return Err(SerializationError::ValidationFailed {
                field: "preferences".to_string(),
                message: "Too many preferences (maximum 20)".to_string(),
            });
        }

        for (key, value) in &preferences {
            if key.len() > 50 || value.len() > 200 {
                return Err(SerializationError::ValidationFailed {
                    field: "preferences".to_string(),
                    message: format!("Preference key/value too long: {}", key),
                });
            }
        }

        Ok(ValidatedUserInput {
            username,
            email,
            age,
            preferences,
        })
    }
}
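// Illustrative, crate-independent sketch of the validation predicates used in
// `from_deserializer` above. Because they run during deserialization, invalid
// input is rejected at load time rather than after a struct is constructed.
// The helper names in `demo_validation` are stand-ins, not crate API.
#[cfg(test)]
mod demo_validation {
    pub fn valid_username(u: &str) -> bool {
        !u.is_empty()
            && u.len() <= 50
            && u.chars().all(|c| c.is_alphanumeric() || c == '_' || c == '-')
    }

    pub fn valid_email(e: &str) -> bool {
        // Intentionally the same shallow check as above, not full RFC parsing.
        e.contains('@') && e.contains('.') && e.len() >= 5
    }

    pub fn valid_age(age: u16) -> bool {
        (13..=120).contains(&age)
    }

    #[test]
    fn accepts_good_and_rejects_bad_input() {
        assert!(valid_username("alice_01"));
        assert!(!valid_username("no spaces"));
        assert!(valid_email("info@techcorp.com"));
        assert!(!valid_email("bad"));
        assert!(valid_age(30));
        assert!(!valid_age(12));
    }
}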

impl ToFieldValue for ValidatedUserInput {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for ValidatedUserInput {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize ValidatedUserInput from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!(
                        "Failed to deserialize ValidatedUserInput from binary: {}",
                        e
                    ),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for ValidatedUserInput, found {}",
                value.type_name()
            ),
        })
    }
}

/// Recovery helper for handling partial data
#[derive(Debug, Clone, PartialEq)]
pub struct RecoverableData {
    pub critical_field: String,
    pub important_field: Option<String>,
    pub optional_field: Option<String>,
    pub metadata: HashMap<String, String>,
}

impl StructSerializable for RecoverableData {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("critical_field", &self.critical_field)
            .field("important_field", &self.important_field)
            .field("optional_field", &self.optional_field)
            .field("metadata", &self.metadata)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        // Critical field - must exist
        let critical_field = deserializer.field("critical_field")?;

        // Important field - absence becomes None
        let important_field = deserializer.field_optional("important_field")?;

        // Optional field - graceful fallback
        let optional_field = deserializer.field_optional("optional_field")?;

        // Metadata - default to an empty map if missing
        let metadata = deserializer.field_or("metadata", HashMap::new())?;

        Ok(RecoverableData {
            critical_field,
            important_field,
            optional_field,
            metadata,
        })
    }
}
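// Illustrative sketch of the three recovery tiers above: `field` (must exist),
// `field_optional` (absence becomes None), and `field_or` (absence falls back
// to a default). A plain HashMap stands in for StructDeserializer here; every
// name in `demo_recovery` is a stand-in, not crate API.
#[cfg(test)]
mod demo_recovery {
    use std::collections::HashMap;

    pub fn recover(
        fields: &HashMap<&str, String>,
    ) -> Result<(String, Option<String>, String), String> {
        // Critical tier: absence is a hard error, mirroring `deserializer.field(...)?`.
        let critical = fields
            .get("critical_field")
            .cloned()
            .ok_or_else(|| "missing critical_field".to_string())?;
        // Optional tier: absence becomes None, mirroring `field_optional`.
        let important = fields.get("important_field").cloned();
        // Defaulted tier: absence falls back to a default, mirroring `field_or`.
        let metadata = fields
            .get("metadata")
            .cloned()
            .unwrap_or_else(|| "{}".to_string());
        Ok((critical, important, metadata))
    }

    #[test]
    fn partial_records_still_load() {
        let mut partial = HashMap::new();
        partial.insert("critical_field", "present".to_string());

        // Only the critical field is present: the record still loads.
        let (critical, important, metadata) = recover(&partial).unwrap();
        assert_eq!(critical, "present");
        assert_eq!(important, None);
        assert_eq!(metadata, "{}");

        // Without the critical field, recovery fails outright.
        assert!(recover(&HashMap::new()).is_err());
    }
}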

impl ToFieldValue for RecoverableData {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for RecoverableData {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize RecoverableData from JSON: {}", e),
            });
        }

        // Try binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize RecoverableData from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for RecoverableData, found {}",
                value.type_name()
            ),
        })
    }
}

impl FromFieldValue for PerformanceMetrics {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
88 // Try JSON object first
89 if let Ok(json_data) = value.as_json_object() {
90 return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
91 field: field_name.to_string(),
92 message: format!("Failed to deserialize PerformanceMetrics from JSON: {}", e),
93 });
94 }
95
96 // Try binary object
97 if let Ok(binary_data) = value.as_binary_object() {
98 return Self::from_binary(binary_data).map_err(|e| {
99 SerializationError::ValidationFailed {
100 field: field_name.to_string(),
101 message: format!(
102 "Failed to deserialize PerformanceMetrics from binary: {}",
103 e
104 ),
105 }
106 });
107 }
108
109 Err(SerializationError::ValidationFailed {
110 field: field_name.to_string(),
111 message: format!(
112 "Expected JsonObject or BinaryObject for PerformanceMetrics, found {}",
113 value.type_name()
114 ),
115 })
116 }
117}
118
119/// Large dataset for performance testing
#[derive(Debug, Clone, PartialEq)]
pub struct LargeDataset {
    pub name: String,
    pub values: Vec<f32>, // Changed from f64 to f32 (supported)
    pub labels: Vec<String>,
    pub feature_count: usize, // Simplified from Vec<Vec<f32>> to just a count
    pub feature_dimension: usize, // Store dimensions separately
    pub metadata: HashMap<String, String>,
    pub timestamp_count: usize, // Simplified from Vec<u64> to just a count
}

impl StructSerializable for LargeDataset {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("name", &self.name)
            .field("values", &self.values)
            .field("labels", &self.labels)
            .field("feature_count", &self.feature_count)
            .field("feature_dimension", &self.feature_dimension)
            .field("metadata", &self.metadata)
            .field("timestamp_count", &self.timestamp_count)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        let name = deserializer.field("name")?;
        let values = deserializer.field("values")?;
        let labels = deserializer.field("labels")?;
        let feature_count = deserializer.field("feature_count")?;
        let feature_dimension = deserializer.field("feature_dimension")?;
        let metadata = deserializer.field("metadata")?;
        let timestamp_count = deserializer.field("timestamp_count")?;

        Ok(LargeDataset {
            name,
            values,
            labels,
            feature_count,
            feature_dimension,
            metadata,
            timestamp_count,
        })
    }
}

impl ToFieldValue for LargeDataset {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for LargeDataset {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize LargeDataset from JSON: {}", e),
            });
        }

        // Fall back to a binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize LargeDataset from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for LargeDataset, found {}",
                value.type_name()
            ),
        })
    }
}

/// Configuration data (typical JSON use case)
#[derive(Debug, Clone, PartialEq)]
pub struct Configuration {
    pub version: String,
    pub debug_enabled: bool,
    pub log_level: String,
    pub database_settings: HashMap<String, String>,
    pub feature_flags_enabled: bool, // Simplified from HashMap<String, bool>
    pub max_connections: f32, // Simplified from HashMap<String, f64>
    pub timeout_seconds: f32,
}

impl StructSerializable for Configuration {
    fn to_serializer(&self) -> StructSerializer {
        StructSerializer::new()
            .field("version", &self.version)
            .field("debug_enabled", &self.debug_enabled)
            .field("log_level", &self.log_level)
            .field("database_settings", &self.database_settings)
            .field("feature_flags_enabled", &self.feature_flags_enabled)
            .field("max_connections", &self.max_connections)
            .field("timeout_seconds", &self.timeout_seconds)
    }

    fn from_deserializer(deserializer: &mut StructDeserializer) -> SerializationResult<Self> {
        let version = deserializer.field("version")?;
        let debug_enabled = deserializer.field("debug_enabled")?;
        let log_level = deserializer.field("log_level")?;
        let database_settings = deserializer.field("database_settings")?;
        let feature_flags_enabled = deserializer.field("feature_flags_enabled")?;
        let max_connections = deserializer.field("max_connections")?;
        let timeout_seconds = deserializer.field("timeout_seconds")?;

        Ok(Configuration {
            version,
            debug_enabled,
            log_level,
            database_settings,
            feature_flags_enabled,
            max_connections,
            timeout_seconds,
        })
    }
}

impl ToFieldValue for Configuration {
    fn to_field_value(&self) -> FieldValue {
        match self.to_json() {
            Ok(json_str) => FieldValue::from_json_object(json_str),
            Err(_) => FieldValue::from_string("serialization_error".to_string()),
        }
    }
}

impl FromFieldValue for Configuration {
    fn from_field_value(value: FieldValue, field_name: &str) -> SerializationResult<Self> {
        // Try JSON object first
        if let Ok(json_data) = value.as_json_object() {
            return Self::from_json(json_data).map_err(|e| SerializationError::ValidationFailed {
                field: field_name.to_string(),
                message: format!("Failed to deserialize Configuration from JSON: {}", e),
            });
        }

        // Fall back to a binary object
        if let Ok(binary_data) = value.as_binary_object() {
            return Self::from_binary(binary_data).map_err(|e| {
                SerializationError::ValidationFailed {
                    field: field_name.to_string(),
                    message: format!("Failed to deserialize Configuration from binary: {}", e),
                }
            });
        }

        Err(SerializationError::ValidationFailed {
            field: field_name.to_string(),
            message: format!(
                "Expected JsonObject or BinaryObject for Configuration, found {}",
                value.type_name()
            ),
        })
    }
}

/// Format comparison results
#[derive(Debug)]
pub struct FormatComparison {
    pub data_type: String,
    pub json_size_bytes: u64,
    pub binary_size_bytes: u64,
    pub json_serialize_micros: u64,
    pub binary_serialize_micros: u64,
    pub json_deserialize_micros: u64,
    pub binary_deserialize_micros: u64,
    pub size_ratio: f64,
    pub serialize_speed_ratio: f64,
    pub deserialize_speed_ratio: f64,
}

impl FormatComparison {
    fn new(data_type: String) -> Self {
        Self {
            data_type,
            json_size_bytes: 0,
            binary_size_bytes: 0,
            json_serialize_micros: 0,
            binary_serialize_micros: 0,
            json_deserialize_micros: 0,
            binary_deserialize_micros: 0,
            size_ratio: 0.0,
            serialize_speed_ratio: 0.0,
            deserialize_speed_ratio: 0.0,
        }
    }

    fn calculate_ratios(&mut self) {
        // Guard against zero denominators: small payloads can serialize in
        // under a microsecond, so a timer reading of 0 is possible.
        self.size_ratio = self.json_size_bytes as f64 / self.binary_size_bytes.max(1) as f64;
        self.serialize_speed_ratio =
            self.binary_serialize_micros as f64 / self.json_serialize_micros.max(1) as f64;
        self.deserialize_speed_ratio =
            self.binary_deserialize_micros as f64 / self.json_deserialize_micros.max(1) as f64;
    }
}

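As a side note on the microsecond fields in FormatComparison: a single `Instant` pair is noisy at this scale. A minimal stdlib-only sketch of a steadier measurement (`avg_micros` is a hypothetical helper, not part of the library API) that averages the cost of an operation over many runs:

```rust
use std::time::Instant;

/// Average wall-clock microseconds per call of `op` over `iterations` runs.
/// Hypothetical helper for illustration; not part of the library API.
fn avg_micros<F: FnMut()>(iterations: u32, mut op: F) -> f64 {
    let start = Instant::now();
    for _ in 0..iterations {
        op();
    }
    // `.max(1)` keeps the division well-defined for `iterations == 0`.
    start.elapsed().as_micros() as f64 / f64::from(iterations.max(1))
}
```

For example, `avg_micros(1_000, || { let _ = config.to_json(); })` smooths out scheduler jitter that a single `Instant::now()` pair cannot.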
fn main() -> Result<(), Box<dyn std::error::Error>> {
    println!("=== JSON vs Binary Format Comparison Example ===\n");

    demonstrate_format_characteristics()?;
    demonstrate_size_comparisons()?;
    demonstrate_performance_benchmarks()?;
    demonstrate_use_case_recommendations()?;
    demonstrate_debugging_capabilities()?;
    cleanup_temp_files()?;

    println!("\n=== Example completed successfully! ===");
    Ok(())
}

/// Demonstrate basic format characteristics
fn demonstrate_format_characteristics() -> Result<(), Box<dyn std::error::Error>> {
    println!("--- Format Characteristics ---");

    // Create sample data structures
    let mut metadata = HashMap::new();
    metadata.insert("operation_type".to_string(), "benchmark".to_string());
    metadata.insert("system".to_string(), "train_station".to_string());

    let metrics = PerformanceMetrics {
        operation: "tensor_multiplication".to_string(),
        duration_micros: 1234,
        memory_usage_bytes: 8192,
        cpu_usage_percent: 75.5,
        throughput_ops_per_sec: 1000.0,
        metadata,
    };

    println!("Format characteristics analysis:");

    // JSON characteristics
    let json_data = metrics.to_json()?;
    let json_lines = json_data.lines().count();
    let json_chars = json_data.chars().count();

    println!("\nJSON Format:");
    println!("  Size: {} bytes", json_data.len());
    println!("  Characters: {}", json_chars);
    println!("  Lines: {}", json_lines);
    println!("  Human readable: Yes");
    println!("  Self-describing: Yes");
    println!("  Cross-platform: Yes");
    println!("  Encoding efficiency: Variable (depends on content)");

    // Show sample JSON output
    println!("  Sample output:");
    for line in json_data.lines().take(3) {
        println!("    {}", line);
    }
    if json_lines > 3 {
        println!("    ... ({} more lines)", json_lines - 3);
    }

    // Binary characteristics
    let binary_data = metrics.to_binary()?;

    println!("\nBinary Format:");
    println!("  Size: {} bytes", binary_data.len());
    println!("  Human readable: No");
    println!("  Self-describing: No (requires schema)");
    println!("  Cross-platform: Yes (with proper endianness handling)");
    println!("  Encoding efficiency: High (compact native representation)");

    // Show sample binary output (hex)
    println!("  Sample output (first 32 bytes as hex):");
    print!("    ");
    for (i, byte) in binary_data.iter().take(32).enumerate() {
        if i > 0 && i % 16 == 0 {
            println!();
            print!("    ");
        }
        print!("{:02x} ", byte);
    }
    if binary_data.len() > 32 {
        println!("\n    ... ({} more bytes)", binary_data.len() - 32);
    } else {
        println!();
    }

    // Verify roundtrip for both formats
    let json_parsed = PerformanceMetrics::from_json(&json_data)?;
    let binary_parsed = PerformanceMetrics::from_binary(&binary_data)?;

    assert_eq!(metrics, json_parsed);
    assert_eq!(metrics, binary_parsed);
    println!("\nRoundtrip verification: PASSED");

    Ok(())
}

/// Demonstrate size comparisons across different data types
fn demonstrate_size_comparisons() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Size Comparison Analysis ---");

    // Test 1: Small configuration data (typical JSON use case)
    let mut db_settings = HashMap::new();
    db_settings.insert("host".to_string(), "localhost".to_string());
    db_settings.insert("port".to_string(), "5432".to_string());
    db_settings.insert("database".to_string(), "myapp".to_string());

    let config = Configuration {
        version: "1.2.3".to_string(),
        debug_enabled: true,
        log_level: "info".to_string(),
        database_settings: db_settings,
        feature_flags_enabled: true,
        max_connections: 100.0,
        timeout_seconds: 30.0,
    };

    // Test 2: Large numeric dataset (typical binary use case)
    let large_dataset = LargeDataset {
        name: "ML Training Data".to_string(),
        values: (0..1000).map(|i| i as f32 * 0.1).collect(),
        labels: (0..1000).map(|i| format!("label_{}", i)).collect(),
        feature_count: 100,
        feature_dimension: 50,
        timestamp_count: 1000,
        metadata: HashMap::new(),
    };

    println!("Size comparison results:");

    // Configuration comparison
    let config_json = config.to_json()?;
    let config_binary = config.to_binary()?;

    println!("\nConfiguration Data (small, text-heavy):");
    println!("  JSON: {} bytes", config_json.len());
    println!("  Binary: {} bytes", config_binary.len());
    println!(
        "  Ratio (JSON/Binary): {:.2}x",
        config_json.len() as f64 / config_binary.len() as f64
    );
    println!("  Recommendation: JSON (human readable, small size difference)");

    // Large dataset comparison
    let dataset_json = large_dataset.to_json()?;
    let dataset_binary = large_dataset.to_binary()?;

    println!("\nLarge Numeric Dataset (1000 values, 100x50 matrix):");
    println!(
        "  JSON: {} bytes ({:.1} KB)",
        dataset_json.len(),
        dataset_json.len() as f64 / 1024.0
    );
    println!(
        "  Binary: {} bytes ({:.1} KB)",
        dataset_binary.len(),
        dataset_binary.len() as f64 / 1024.0
    );
    println!(
        "  Ratio (JSON/Binary): {:.2}x",
        dataset_json.len() as f64 / dataset_binary.len() as f64
    );
    if dataset_json.len() > dataset_binary.len() {
        println!(
            "  Space saved with binary: {} bytes ({:.1} KB)",
            dataset_json.len() - dataset_binary.len(),
            (dataset_json.len() - dataset_binary.len()) as f64 / 1024.0
        );
        println!("  Recommendation: Binary (significant size reduction)");
    } else {
        println!(
            "  Binary overhead: {} bytes ({:.1} KB)",
            dataset_binary.len() - dataset_json.len(),
            (dataset_binary.len() - dataset_json.len()) as f64 / 1024.0
        );
        println!("  Recommendation: JSON (binary overhead not justified for this size)");
    }

    // Content analysis
    println!("\nContent Type Analysis:");

    // Analyze JSON content patterns
    let json_numbers = dataset_json.matches(char::is_numeric).count();
    let json_separators = dataset_json.matches('[').count()
        + dataset_json.matches(']').count()
        + dataset_json.matches(',').count();
    let json_quotes = dataset_json.matches('"').count();

    println!("  JSON overhead sources:");
    println!("    Numeric characters: ~{}", json_numbers);
    println!("    Brackets and commas: ~{}", json_separators);
    println!("    Quote marks: {}", json_quotes);
    println!("    Formatting/whitespace: Varies");

    println!("  Binary advantages:");
    println!("    Direct numeric encoding: 4-8 bytes per number");
    println!("    No formatting overhead: Zero bytes");
    println!("    Efficient length encoding: Minimal bytes");

    Ok(())
}

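The overhead accounting above can be made concrete with a small stdlib-only sketch (`encoding_sizes` is a hypothetical helper, not part of the library): a raw binary encoding costs exactly 4 bytes per f32, while the decimal text form plus one separator character varies with the value.

```rust
/// Rough per-value cost of text vs raw binary encoding for a slice of f32.
/// Hypothetical illustration; real JSON adds brackets and field names on top.
fn encoding_sizes(values: &[f32]) -> (usize, usize) {
    let text_len: usize = values
        .iter()
        .map(|v| v.to_string().len() + 1) // decimal digits plus one separator
        .sum();
    let binary_len = values.len() * std::mem::size_of::<f32>(); // 4 bytes each
    (text_len, binary_len)
}
```

For `&[0.0, 1.5]` this gives a text cost of 6 bytes ("0," and "1.5,") against a fixed binary cost of 8 bytes; the balance tips decisively toward binary as values grow more digits, which is exactly the pattern the size comparison above reports for the 1000-element dataset.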
/// Demonstrate performance benchmarks
fn demonstrate_performance_benchmarks() -> Result<(), Box<dyn std::error::Error>> {
    println!("\n--- Performance Benchmark Analysis ---");

    // Create test data of varying sizes
    let small_config = Configuration {
        version: "1.0.0".to_string(),
        debug_enabled: false,
        log_level: "warn".to_string(),
        database_settings: HashMap::new(),
        feature_flags_enabled: false,
        max_connections: 100.0,
        timeout_seconds: 30.0,
    };

    let large_dataset = LargeDataset {
        name: "Large Dataset".to_string(),
        values: (0..5000).map(|i| i as f32 * 0.001).collect(),
        labels: (0..5000).map(|i| format!("large_item_{}", i)).collect(),
        feature_count: 200,
        feature_dimension: 25,
        timestamp_count: 5000,
        metadata: HashMap::new(),
    };

    println!("Performance benchmark results:");

    // Benchmark each dataset (avoiding trait objects: the trait is not dyn compatible)
    let dataset_names = ["Small Config", "Large Dataset"];

    for (i, name) in dataset_names.iter().enumerate() {
        let mut comparison = FormatComparison::new(name.to_string());

        // JSON serialization benchmark
        let start = Instant::now();
        let json_data = match i {
            0 => small_config.to_json()?,
            _ => large_dataset.to_json()?,
        };
        comparison.json_serialize_micros = start.elapsed().as_micros() as u64;
        comparison.json_size_bytes = json_data.len() as u64;

        // JSON deserialization benchmark
        if *name == "Small Config" {
            let start = Instant::now();
            let _parsed = Configuration::from_json(&json_data)?;
            comparison.json_deserialize_micros = start.elapsed().as_micros() as u64;
        } else {
            let start = Instant::now();
            let _parsed = LargeDataset::from_json(&json_data)?;
            comparison.json_deserialize_micros = start.elapsed().as_micros() as u64;
        }

        // Binary serialization benchmark
        let start = Instant::now();
        let binary_data = match i {
            0 => small_config.to_binary()?,
            _ => large_dataset.to_binary()?,
        };
        comparison.binary_serialize_micros = start.elapsed().as_micros() as u64;
        comparison.binary_size_bytes = binary_data.len() as u64;

        // Binary deserialization benchmark
        if *name == "Small Config" {
            let start = Instant::now();
            let _parsed = Configuration::from_binary(&binary_data)?;
            comparison.binary_deserialize_micros = start.elapsed().as_micros() as u64;
        } else {
            let start = Instant::now();
            let _parsed = LargeDataset::from_binary(&binary_data)?;
            comparison.binary_deserialize_micros = start.elapsed().as_micros() as u64;
        }

        // Calculate ratios
        comparison.calculate_ratios();

        // Display results (time ratios below 1.0 mean binary was faster)
        println!("\n{}:", name);
        println!(
            "  Size - JSON: {} bytes, Binary: {} bytes (JSON/Binary: {:.2}x)",
            comparison.json_size_bytes, comparison.binary_size_bytes, comparison.size_ratio
        );
        println!(
            "  Serialize - JSON: {}μs, Binary: {}μs (Binary/JSON time: {:.2}x)",
            comparison.json_serialize_micros,
            comparison.binary_serialize_micros,
            comparison.serialize_speed_ratio
        );
        println!(
            "  Deserialize - JSON: {}μs, Binary: {}μs (Binary/JSON time: {:.2}x)",
            comparison.json_deserialize_micros,
            comparison.binary_deserialize_micros,
            comparison.deserialize_speed_ratio
        );
    }

    println!("\nPerformance Summary:");
    println!("  - Binary format consistently uses less storage space");
    println!("  - Performance differences vary by data type and size");
    println!("  - Larger datasets show more significant binary advantages");
    println!("  - JSON parsing overhead increases with structure complexity");

    Ok(())
}

§Dyn Compatibility
This trait is not dyn compatible.
In older versions of Rust, dyn compatibility was called "object safety", so this trait is not object safe.
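A minimal sketch of why: the `Sized` supertrait and `from_deserializer` (an associated function with no `self` receiver that returns `Self`) each independently rule out `dyn StructSerializable`. The trait and type below are illustrative stand-ins, not part of this crate:

```rust
// Illustrative stand-ins, not part of this crate.
trait Buildable: Sized {
    fn build() -> Self; // no `self` receiver, returns `Self`: not dyn compatible
}

struct Point(i32);

impl Buildable for Point {
    fn build() -> Self {
        Point(7)
    }
}

// Static dispatch works fine: `Point::build()`.
// A trait object does not:
// let obj: Box<dyn Buildable> = Box::new(Point::build()); // error[E0038]
```

This is why the benchmark code above iterates over concrete types instead of collecting them behind a trait object.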