Defines the `Pipeline` struct and related components for processing blockchain data updates.

The `Pipeline` module is central to the carbon-core framework, offering a flexible and extensible data processing architecture that supports various blockchain data types, including account updates, transaction details, and account deletions. The pipeline integrates multiple data sources and processing pipes to handle and transform incoming data, while recording performance metrics for monitoring and analysis.
§Overview
This module provides the `Pipeline` struct, which orchestrates data flow
from multiple sources, processes it through designated pipes, and captures
metrics at each stage. The pipeline is highly customizable and can be
configured with various components to suit specific data handling
requirements.
§Key Components
- Datasources: Provide raw data updates, which may include account or transaction details.
- Account, Instruction, and Transaction Pipes: Modular units that decode and process specific types of data. Account pipes handle account updates, instruction pipes process instructions within transactions, and transaction pipes manage complete transaction records.
- Metrics: Collects data on pipeline performance, such as processing times and error rates, providing insights into operational efficiency.
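How these components fit together can be sketched with simplified, self-contained stand-ins. Note that these trait and type definitions are illustrative only: the real carbon-core traits are async and generic, and their exact names and signatures differ.

```rust
// Simplified stand-ins for the component roles described above; the real
// carbon-core traits are async and generic, so treat this as a sketch.
enum Update {
    Account { pubkey: String, lamports: u64 },
    Transaction { signature: String },
}

trait Datasource {
    fn next_update(&mut self) -> Option<Update>;
}

trait AccountPipe {
    fn run(&self, pubkey: &str, lamports: u64);
}

trait Metrics {
    fn record(&mut self, name: &str, value: u64);
}

// Drive one source to exhaustion, routing account updates through every
// account pipe and counting processed updates in the metrics sink.
fn drive(
    source: &mut dyn Datasource,
    account_pipes: &[Box<dyn AccountPipe>],
    metrics: &mut dyn Metrics,
) {
    while let Some(update) = source.next_update() {
        match update {
            Update::Account { pubkey, lamports } => {
                for pipe in account_pipes {
                    pipe.run(&pubkey, lamports);
                }
                metrics.record("account_updates", 1);
            }
            Update::Transaction { .. } => {
                metrics.record("transactions", 1);
            }
        }
    }
}
```

The key design point is that sources, pipes, and metrics are independent trait objects, so each can be swapped or extended without touching the driving loop.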
§Fields and Configuration
- datasources: A list of `Datasource` objects that act as the sources for account and transaction data.
- account_pipes: A collection of pipes for processing account updates.
- account_deletion_pipes: Pipes responsible for handling account deletion events.
- instruction_pipes: Used to process instructions within transactions.
- transaction_pipes: For handling full transactions.
- metrics: A vector of `Metrics` implementations that gather and report on performance data.
- metrics_flush_interval: Specifies how frequently metrics are flushed. Defaults to 5 seconds if unset.
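The defaulting behavior of metrics_flush_interval can be illustrated with a minimal builder stand-in. This is a hypothetical, stripped-down `PipelineBuilder`; the real builder accepts many more components (data sources, pipes, and so on).

```rust
use std::time::Duration;

// Hypothetical, minimal stand-in for the builder configuration; the real
// PipelineBuilder also takes datasources, pipes, and metrics.
struct Pipeline {
    metrics_flush_interval: Duration,
}

#[derive(Default)]
struct PipelineBuilder {
    metrics_flush_interval: Option<Duration>,
}

impl PipelineBuilder {
    fn metrics_flush_interval(mut self, interval: Duration) -> Self {
        self.metrics_flush_interval = Some(interval);
        self
    }

    fn build(self) -> Pipeline {
        Pipeline {
            // As documented above: defaults to 5 seconds when left unset.
            metrics_flush_interval: self
                .metrics_flush_interval
                .unwrap_or(Duration::from_secs(5)),
        }
    }
}
```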
§Notes
- Each pipe and data source must implement the appropriate traits (`Datasource`, `AccountPipes`, `Metrics`, etc.).
- The `Pipeline` is designed for concurrent operation, with `Arc` and `Box` wrappers ensuring safe, shared access.
- Proper metric collection and flushing are essential for monitoring pipeline performance, especially in production environments.
Structs§
- `Pipeline` - Represents the primary data processing pipeline in the carbon-core framework.
- `PipelineBuilder` - A builder for constructing a `Pipeline` instance with customized data sources, processing pipes, and metrics.
Enums§
- `ShutdownStrategy` - Defines the shutdown behavior for the pipeline.
Constants§
- `DEFAULT_CHANNEL_BUFFER_SIZE` - The default size of the channel buffer for the pipeline.
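What a bounded channel buffer controls can be shown with the standard library's `sync_channel`. The capacity of 8 used here is arbitrary for illustration; it is not the crate's actual `DEFAULT_CHANNEL_BUFFER_SIZE` value.

```rust
use std::sync::mpsc::sync_channel;
use std::thread;

// A bounded channel caps how many pending updates can queue before
// senders block, which keeps a fast producer from outrunning consumers.
fn sum_through_bounded_channel(count: u64) -> u64 {
    let (tx, rx) = sync_channel::<u64>(8); // arbitrary capacity for the sketch
    let producer = thread::spawn(move || {
        for i in 0..count {
            tx.send(i).unwrap(); // blocks while the buffer is full
        }
        // tx is dropped here, closing the channel so rx.iter() terminates
    });
    let sum: u64 = rx.iter().sum();
    producer.join().unwrap();
    sum
}
```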