yellowstone_jet_tpu_client/lib.rs
//!
//! Yellowstone jet-tpu-client
//!
//! This crate is a port of the custom TPU-QUIC client used by [Yellowstone Jet](https://github.com/rpcpool/yellowstone-jet),
//! a subsystem of [Cascade-Marketplace](https://triton.one/cascade).
//!
//! This crate exposes a generic TPU sender implementation [TpuSender](`crate::sender::TpuSender`) that can be used with different
//! TPU info services, stake info services, eviction strategies, and leader schedule predictors.
//!
//! The core async event-loop engine uses the [quinn] and [tokio] crates to provide a high-performance QUIC-based transport protocol implementation.
//! It is designed to handle the latest Agave network changes and covers all the edge cases observed in production usage:
//!
//! 1. Automatic leader schedule tracking and slot updates
//! 2. Automatic TPU contact-info handling:
//!    - Contact-info discovery using the latest gossip information from the network.
//!    - Handling of TPU endpoint changes due to leader contact-info flapping (e.g. Jito validators)
//! 3. Automatic connection management: reconnects, connection prediction, and failure handling.
//! 4. Transaction rescue on dropped connections (e.g. due to remote-peer connection eviction)
//! 5. Stake-aware TPU selection and eviction strategies.
//!
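//!
//! Item 4 can be sketched in a crate-agnostic way: transactions that were in
//! flight when a connection dropped are re-queued and retried on the next
//! connection instead of being lost. The function below is purely illustrative
//! (`send_with_rescue` and its shape are assumptions for the example, not this
//! crate's API):
//!
//! ```rust
//! /// Try to deliver every transaction, re-queueing failures for up to
//! /// `max_attempts` rounds. Returns the transactions that never made it.
//! fn send_with_rescue<F>(
//!     mut txs: Vec<Vec<u8>>,
//!     max_attempts: usize,
//!     mut try_send: F,
//! ) -> Vec<Vec<u8>>
//! where
//!     F: FnMut(&[u8]) -> Result<(), ()>,
//! {
//!     for _ in 0..max_attempts {
//!         let mut rescued = Vec::new();
//!         for tx in txs {
//!             if try_send(&tx).is_err() {
//!                 // The connection dropped mid-send: keep the transaction
//!                 // so it can be replayed on a fresh connection.
//!                 rescued.push(tx);
//!             }
//!         }
//!         if rescued.is_empty() {
//!             return rescued;
//!         }
//!         // Retry the rescued transactions on the next attempt.
//!         txs = rescued;
//!     }
//!     txs
//! }
//! ```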
//! ## `YellowstoneTpuSender`: smart TPU sender implementation
//!
//! This crate comes with a _smart_ TPU sender implementation: [YellowstoneTpuSender](`crate::yellowstone_grpc::sender::YellowstoneTpuSender`).
//!
//! This sender implementation supports three different sending strategies:
//!
//! 1. Send a transaction to one or more remote peers
//! 2. Send to the current leader
//! 3. Send to the current leader AND the next `N-1` leaders in the schedule.
//!
//! The sender automatically tracks the current slot and leader schedule.
//!
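//!
//! Strategy 3 can be illustrated independently of this crate's API. The sketch
//! below is a plain function, not this crate's implementation: the name
//! `leader_fanout`, the 4-slot rotation constant, and the slot-indexed
//! schedule shape are all assumptions for the example.
//!
//! ```rust
//! /// Slots per leader rotation (4 on Agave-based clusters).
//! const SLOTS_PER_LEADER: u64 = 4;
//!
//! /// Return the leader for `current_slot` plus the leaders of the following
//! /// rotations, up to `n` entries, skipping consecutive repeats of the same
//! /// leader. `schedule[slot]` is the leader identity for that slot.
//! fn leader_fanout(schedule: &[&str], current_slot: u64, n: usize) -> Vec<String> {
//!     let mut out: Vec<String> = Vec::new();
//!     let mut slot = current_slot;
//!     while out.len() < n && (slot as usize) < schedule.len() {
//!         let leader = schedule[slot as usize].to_string();
//!         if out.last() != Some(&leader) {
//!             out.push(leader);
//!         }
//!         // Jump to the first slot of the next rotation.
//!         slot = (slot / SLOTS_PER_LEADER + 1) * SLOTS_PER_LEADER;
//!     }
//!     out
//! }
//!
//! let schedule = ["A", "A", "A", "A", "B", "B", "B", "B", "C", "C", "C", "C"];
//! assert_eq!(leader_fanout(&schedule, 2, 2), ["A", "B"]);
//! ```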
//! ## Example
//!
//! See the [repository](https://github.com/rpcpool/yellowstone-jet/blob/main/crates/tpu-client/src/bin/test-tpu-send.rs) for more examples.
//!
//! ## Feature flags
//!
//! - **prometheus**: Enables the Prometheus metrics exposition module [`crate::prom`]
//! - **yellowstone-grpc**: Enables the Yellowstone gRPC based TPU sender implementation [`crate::yellowstone_grpc`]
//! - **bytes**: Enables `bytes` crate based transaction representation support in the TPU sender
//!
///
/// module for top-level configuration objects
///
pub mod config;
///
/// module for the core TPU sending driver logic
///
pub mod core;
///
/// module for the common TPU sender implementation
///
pub mod sender;

///
/// module enabling Prometheus metrics exposition
///
#[cfg(feature = "prometheus")]
pub mod prom;

///
/// module for RPC utilities
///
pub mod rpc;

///
/// module for slot tracking
///
pub mod slot;

///
/// module hosting utilities that build on Yellowstone gRPC services
///
#[cfg(feature = "yellowstone-grpc")]
pub mod yellowstone_grpc;