// Licensed to the Apache Software Foundation (ASF) under one
// or more contributor license agreements. See the NOTICE file
// distributed with this work for additional information
// regarding copyright ownership. The ASF licenses this file
// to you under the Apache License, Version 2.0 (the
// "License"); you may not use this file except in compliance
// with the License. You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing,
// software distributed under the License is distributed on an
// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
// KIND, either express or implied. See the License for the
// specific language governing permissions and limitations
// under the License.
//! Registry of physical expressions that support nested list column pushdown
//! to the Parquet decoder.
//!
//! This module provides a trait-based approach for determining which predicates
//! can be safely evaluated on nested list columns during Parquet decoding.
use std::sync::Arc;
// NOTE: the import paths below were lost in this copy of the file; these are
// reconstructions based on the types the doc comments reference.
use datafusion_physical_expr::PhysicalExpr;
use datafusion_physical_expr::ScalarFunctionExpr;
/// Trait for physical expressions that support list column pushdown during
/// Parquet decoding.
///
/// This trait provides a type-safe mechanism for identifying expressions that
/// can be safely pushed down to the Parquet decoder for evaluation on nested
/// list columns.
///
/// # Implementation Notes
///
/// Expression types in external crates cannot directly implement this trait
/// due to Rust's orphan rules. Instead, we use a blanket implementation that
/// delegates to a registration mechanism.
///
/// # Examples
///
/// ```ignore
/// use std::sync::Arc;
/// use datafusion_physical_expr::PhysicalExpr;
/// use datafusion_datasource_parquet::SupportsListPushdown;
///
/// let expr: Arc<dyn PhysicalExpr> = ...;
/// if expr.supports_list_pushdown() {
///     // Can safely push down to Parquet decoder
/// }
/// ```
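The trait item these docs describe is missing from this copy of the file. Below is a minimal sketch reconstructed from the doc example above: the trait and method names come from that example, while `AlwaysPushable` is a purely illustrative stand-in type, not part of DataFusion.

```rust
/// Sketch only: reconstructed from the documentation, not the original item.
pub trait SupportsListPushdown {
    /// Returns `true` if this expression can be safely evaluated on nested
    /// list columns by the Parquet decoder.
    fn supports_list_pushdown(&self) -> bool;
}

// Illustrative stand-in implementation (hypothetical type): an expression
// that always allows pushdown.
pub struct AlwaysPushable;

impl SupportsListPushdown for AlwaysPushable {
    fn supports_list_pushdown(&self) -> bool {
        true
    }
}
```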
/// Blanket implementation for all physical expressions.
///
/// This delegates to specialized predicates that check whether the concrete
/// expression type is registered as supporting list pushdown. This design
/// allows the trait to work with expression types defined in external crates.
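The blanket `impl` itself is absent here. The following self-contained sketch models the delegation pattern described above: `Expr` stands in for DataFusion's `PhysicalExpr`, and `is_registered` stands in for the real registration mechanism. All names inside this module are assumptions, not the actual API.

```rust
pub mod blanket_sketch {
    /// Stand-in for DataFusion's `PhysicalExpr` trait (illustrative only).
    pub trait Expr {
        /// Name of the operation this expression performs.
        fn op_name(&self) -> &str;
    }

    /// Sketch of the pushdown trait being blanket-implemented.
    pub trait SupportsListPushdown {
        fn supports_list_pushdown(&self) -> bool;
    }

    /// Stand-in registration predicate: the real module inspects the
    /// concrete expression type against a registry.
    fn is_registered(op: &str) -> bool {
        matches!(op, "array_has" | "array_has_all" | "array_has_any")
    }

    // The blanket implementation: every expression type automatically gets
    // `supports_list_pushdown` by delegating to the registration check, so
    // external crates never implement the trait directly (sidestepping the
    // orphan-rule problem noted above).
    impl<T: Expr + ?Sized> SupportsListPushdown for T {
        fn supports_list_pushdown(&self) -> bool {
            is_registered(self.op_name())
        }
    }

    /// Example expression carrying just an operation name.
    pub struct NamedOp(pub &'static str);

    impl Expr for NamedOp {
        fn op_name(&self) -> &str {
            self.0
        }
    }

    /// Convenience helper for the usage example.
    pub fn supports(op: &'static str) -> bool {
        NamedOp(op).supports_list_pushdown()
    }
}
```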
/// Checks if an expression is a NULL or NOT NULL check.
///
/// These checks are universally supported for all column types.
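The function body is missing. A sketch of how such a check can work, using `std::any::Any` downcasting against stand-in types named after DataFusion's `IsNullExpr` / `IsNotNullExpr`; the types defined here are local toys, not the real expression types.

```rust
pub mod null_check_sketch {
    use std::any::Any;

    /// Stand-ins (illustrative only) for the physical expressions produced
    /// by `IS NULL` / `IS NOT NULL`.
    pub struct IsNullExpr;
    pub struct IsNotNullExpr;
    pub struct OtherExpr;

    /// Sketch: downcast the type-erased expression and accept only
    /// `IS NULL` / `IS NOT NULL` nodes.
    pub fn is_null_check(expr: &dyn Any) -> bool {
        expr.is::<IsNullExpr>() || expr.is::<IsNotNullExpr>()
    }
}
```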
/// Checks if an expression is a scalar function registered for list pushdown.
///
/// Returns `true` if the expression is a `ScalarFunctionExpr` whose function
/// is in the registry of supported operations.
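The predicate itself is missing. A sketch of the registry lookup, under the simplifying assumption that the registry reduces to a set of supported function names; the real registry may key on something richer, such as UDF identities.

```rust
pub mod registry_sketch {
    /// Function names registered as supporting list pushdown. This set
    /// matches the functions listed in the docs later in this file; the
    /// real registry's contents may differ.
    const SUPPORTED_FUNCTIONS: &[&str] =
        &["array_has", "array_has_all", "array_has_any"];

    /// Sketch: given the name carried by a scalar function expression,
    /// report whether it is in the registry of supported operations.
    pub fn is_registered_function(name: &str) -> bool {
        SUPPORTED_FUNCTIONS.contains(&name)
    }
}
```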
/// Checks whether the given physical expression contains a supported nested
/// predicate (for example, `array_has_all`).
///
/// This function recursively traverses the expression tree to determine if
/// any node contains predicates that support list column pushdown to the
/// Parquet decoder.
///
/// # Supported predicates
///
/// - `IS NULL` and `IS NOT NULL` checks on any column type
/// - Array functions: `array_has`, `array_has_all`, `array_has_any`
///
/// # Returns
///
/// `true` if the expression or any of its children contain supported predicates.
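The traversal function is missing. A self-contained sketch over a toy expression tree that mirrors the recursive rule stated above: a node qualifies if it is a null check, a registered array function, or has any qualifying child. The `Expr` enum here is illustrative, not DataFusion's.

```rust
pub mod traversal_sketch {
    /// Toy expression tree (illustrative stand-in for `PhysicalExpr`).
    pub enum Expr {
        /// `IS NULL` / `IS NOT NULL` check.
        NullCheck,
        /// Scalar function call: name plus argument subexpressions.
        Function(&'static str, Vec<Expr>),
        /// Binary operation combining two subexpressions (e.g. `AND`).
        Binary(Box<Expr>, Box<Expr>),
        /// Anything else: column refs, literals, unsupported operators.
        Other,
    }

    fn is_supported_function(name: &str) -> bool {
        matches!(name, "array_has" | "array_has_all" | "array_has_any")
    }

    /// Sketch of the recursive walk the docs describe: `true` if this node,
    /// or any node beneath it, is a supported nested predicate.
    pub fn contains_supported_predicate(expr: &Expr) -> bool {
        match expr {
            Expr::NullCheck => true,
            Expr::Function(name, args) => {
                is_supported_function(name)
                    || args.iter().any(contains_supported_predicate)
            }
            Expr::Binary(lhs, rhs) => {
                contains_supported_predicate(lhs)
                    || contains_supported_predicate(rhs)
            }
            Expr::Other => false,
        }
    }
}
```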