sql-cli 1.73.1

SQL query tool for CSV/JSON with both interactive TUI and non-interactive CLI modes - perfect for exploration and automation
# WEB CTE Design Document

## Overview
WEB CTEs enable SQL CLI to fetch data directly from HTTP/HTTPS endpoints and use it as Common Table Expressions in SQL queries. This transforms SQL CLI into a powerful data integration tool that can combine local data with live API data.

## Core Concept
```sql
-- Basic syntax: Fetch CSV data from a URL
WITH WEB sales_data AS (
    URL 'https://api.example.com/sales.csv'
    FORMAT CSV
),
local_targets AS (
    SELECT * FROM targets_table
)
SELECT
    s.region,
    s.revenue,
    t.target,
    (s.revenue / t.target) * 100 AS achievement_pct
FROM sales_data s
JOIN local_targets t ON s.region = t.region;
```

## Syntax Design

### Basic WEB CTE Syntax
```sql
WITH WEB cte_name AS (
    URL 'https://endpoint.com/data'
    [FORMAT CSV|JSON]                    -- Data format (default: auto-detect)
    [CACHE seconds]                       -- Cache results for N seconds
    [HEADERS (key = 'value', ...)]       -- HTTP headers
    [METHOD GET|POST]                     -- HTTP method (default: GET)
    [BODY 'request body']                 -- Request body for POST
    [TIMEOUT seconds]                     -- Request timeout
)
```

### Phase 1: Basic Implementation
```sql
-- Simple CSV fetch
WITH WEB weather AS (
    URL 'https://api.weather.com/current.csv'
    FORMAT CSV
)
SELECT * FROM weather WHERE temp > 20;

-- Simple JSON fetch
WITH WEB users AS (
    URL 'https://api.github.com/users/torvalds/repos'
    FORMAT JSON
)
SELECT name, stargazers_count FROM users
ORDER BY stargazers_count DESC;
```

### Phase 2: Authentication Support
```sql
-- Bearer token authentication
WITH WEB protected_data AS (
    URL 'https://api.company.com/data'
    HEADERS (
        Authorization = 'Bearer ${TOKEN}',  -- From environment variable
        Accept = 'application/json'
    )
    FORMAT JSON
)
SELECT * FROM protected_data;

-- API key authentication
WITH WEB api_data AS (
    URL 'https://api.service.com/v1/data'
    HEADERS (
        'X-API-Key' = '${API_KEY}'
    )
)
SELECT * FROM api_data;
```

### Phase 3: Advanced Features
```sql
-- POST request with body
WITH WEB graphql_result AS (
    URL 'https://api.github.com/graphql'
    METHOD POST
    HEADERS (
        Authorization = 'Bearer ${GITHUB_TOKEN}',
        'Content-Type' = 'application/json'
    )
    BODY '{
        "query": "{ viewer { login repositories(first: 5) { nodes { name } } } }"
    }'
    FORMAT JSON
)
SELECT * FROM graphql_result;

-- Caching for expensive APIs
WITH WEB fx_rates AS (
    URL 'https://api.exchangerate.com/latest'
    CACHE 3600  -- Cache for 1 hour
    FORMAT JSON
)
SELECT * FROM fx_rates;
```

## Implementation Architecture

### 1. Parser Extensions
```rust
// Add to ast.rs
pub enum CTEType {
    Standard(SelectStatement),
    Web(WebCTESpec),
}

pub struct WebCTESpec {
    pub url: String,
    pub format: DataFormat,
    pub headers: HashMap<String, String>,
    pub method: HttpMethod,
    pub body: Option<String>,
    pub cache_duration: Option<Duration>,
    pub timeout: Option<Duration>,
}

pub enum DataFormat {
    CSV,
    JSON,
    Auto,  // Auto-detect from Content-Type or extension
}
```
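
To make the clause grammar concrete, here is a hypothetical, simplified line-oriented parse of a WEB CTE body. The names `WebSpec` and `parse_web_body` are illustrative only; the real implementation would hang off the existing SQL tokenizer, and the HEADERS, METHOD, BODY, and TIMEOUT clauses are omitted for brevity:

```rust
// Hypothetical, simplified parser for the clauses inside a WEB CTE body.
#[derive(Debug, Default, PartialEq)]
pub struct WebSpec {
    pub url: String,
    pub format: Option<String>,
    pub cache_secs: Option<u64>,
}

pub fn parse_web_body(body: &str) -> Result<WebSpec, String> {
    let mut spec = WebSpec::default();
    for line in body.lines().map(str::trim).filter(|l| !l.is_empty()) {
        // Each clause is KEYWORD followed by its argument.
        let (kw, rest) = line
            .split_once(' ')
            .ok_or_else(|| format!("malformed clause: {line}"))?;
        match kw.to_ascii_uppercase().as_str() {
            "URL" => spec.url = rest.trim().trim_matches('\'').to_string(),
            "FORMAT" => spec.format = Some(rest.trim().to_ascii_uppercase()),
            "CACHE" => {
                spec.cache_secs =
                    Some(rest.trim().parse().map_err(|_| format!("bad CACHE value: {rest}"))?)
            }
            other => return Err(format!("unknown clause: {other}")),
        }
    }
    if spec.url.is_empty() {
        return Err("URL clause is required".into());
    }
    Ok(spec)
}
```

Keeping clause parsing behind a single function like this makes it easy to extend with the Phase 2 and 3 clauses without touching the surrounding CTE machinery.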

### 2. HTTP Client Module
```rust
// New module: src/web/http_client.rs
pub struct WebDataFetcher {
    client: reqwest::Client,
    cache: Arc<Mutex<LruCache<String, CachedResponse>>>,
}

impl WebDataFetcher {
    pub async fn fetch(&self, spec: &WebCTESpec) -> Result<DataTable> {
        // Check cache first
        if let Some(cached) = self.get_cached(&spec.url) {
            return Ok(cached);
        }

        // Build request
        let mut request = self.client
            .request(spec.method, &spec.url)
            .timeout(spec.timeout.unwrap_or(Duration::from_secs(30)));

        // Add headers
        for (key, value) in &spec.headers {
            request = request.header(key, self.resolve_value(value)?);
        }

        // Add body if POST
        if let Some(body) = &spec.body {
            request = request.body(body.clone());
        }

        // Execute request
        let response = request.send().await?;

        // Parse response based on format
        let data_table = match spec.format {
            DataFormat::CSV => self.parse_csv(response).await?,
            DataFormat::JSON => self.parse_json(response).await?,
            DataFormat::Auto => self.auto_parse(response).await?,
        };

        // Cache if specified
        if let Some(duration) = spec.cache_duration {
            self.cache_response(&spec.url, &data_table, duration);
        }

        Ok(data_table)
    }

    fn resolve_value(&self, value: &str) -> Result<String> {
        // Substitute every ${VAR} occurrence, so embedded forms like
        // 'Bearer ${TOKEN}' resolve correctly, not just a bare '${VAR}'
        let mut out = String::new();
        let mut rest = value;
        while let Some(start) = rest.find("${") {
            out.push_str(&rest[..start]);
            let after = &rest[start + 2..];
            let end = after
                .find('}')
                .ok_or_else(|| anyhow!("Unclosed ${{VAR}} reference in {}", value))?;
            let var_name = &after[..end];
            let resolved = std::env::var(var_name)
                .map_err(|_| anyhow!("Environment variable {} not found", var_name))?;
            out.push_str(&resolved);
            rest = &after[end + 1..];
        }
        out.push_str(rest);
        Ok(out)
    }
}
```

### 3. Data Parsing
```rust
// CSV parsing
async fn parse_csv(&self, response: Response) -> Result<DataTable> {
    let text = response.text().await?;
    let mut reader = csv::Reader::from_reader(text.as_bytes());

    // Build DataTable from CSV
    let headers = reader.headers()?.clone();
    let mut table = DataTable::new("web_data");

    // Add columns
    for header in headers.iter() {
        table.add_column(DataColumn {
            name: header.to_string(),
            data_type: DataType::String,  // Infer later
            // ...
        });
    }

    // Add rows
    for result in reader.records() {
        let record = result?;
        table.add_row(/* convert record to DataRow */);
    }

    Ok(table)
}

// JSON parsing
async fn parse_json(&self, response: Response) -> Result<DataTable> {
    let json: serde_json::Value = response.json().await?;

    // Handle both array of objects and single object
    let rows = match json {
        Value::Array(arr) => arr,
        Value::Object(_) => vec![json],
        _ => return Err(anyhow!("JSON must be object or array")),
    };

    // Infer schema from first few rows
    let schema = infer_json_schema(&rows[..rows.len().min(10)]);

    // Build DataTable
    let mut table = DataTable::new("web_data");
    for (col_name, col_type) in schema {
        table.add_column(/* ... */);
    }

    // Populate rows
    for row in rows {
        table.add_row(/* extract values based on schema */);
    }

    Ok(table)
}
```
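
Nested JSON also needs flattening before it can behave like columns (the use cases below reference dotted paths such as `main.temp`). A minimal sketch of dot-notation flattening, using a stripped-down value enum in place of `serde_json::Value` purely for illustration:

```rust
use std::collections::BTreeMap;

// Stripped-down stand-in for serde_json::Value, for illustration only.
#[derive(Clone, Debug, PartialEq)]
enum Json {
    String(String),
    Number(f64),
    Object(BTreeMap<String, Json>),
}

/// Flatten nested objects into dot-separated column names, so a response
/// like {"main": {"temp": 21.5}} yields a "main.temp" column.
fn flatten(prefix: &str, value: &Json, out: &mut BTreeMap<String, Json>) {
    match value {
        Json::Object(map) => {
            for (key, child) in map {
                let path = if prefix.is_empty() {
                    key.clone()
                } else {
                    format!("{}.{}", prefix, key)
                };
                flatten(&path, child, out);
            }
        }
        // Leaf values land in the output under their accumulated path.
        leaf => {
            out.insert(prefix.to_string(), leaf.clone());
        }
    }
}
```

Arrays are the open design question here: they can either become JSON-text columns or fan out into multiple rows, and that choice should be made before Phase 1 ships.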

## Authentication Strategy

### Phase 1: Environment Variables
```bash
# Set tokens in environment
export API_TOKEN="sk-abc123..."
export GITHUB_TOKEN="ghp_xyz789..."

# Use in SQL
sql-cli -q "
WITH WEB data AS (
    URL 'https://api.service.com/data'
    HEADERS (Authorization = 'Bearer \${API_TOKEN}')
)
SELECT * FROM data"
```

### Phase 2: Configuration File
```toml
# ~/.config/sql-cli/web_auth.toml
[tokens]
github = "ghp_xyz789..."
openai = "sk-abc123..."

[endpoints."api.company.com"]
auth_type = "bearer"
token_env = "COMPANY_API_TOKEN"

[endpoints."data.service.com"]
auth_type = "api_key"
header_name = "X-API-Key"
key_env = "SERVICE_API_KEY"
```

### Phase 3: Dynamic Token Fetching
```sql
-- Fetch token from another endpoint first
WITH WEB auth AS (
    URL 'https://auth.company.com/token'
    METHOD POST
    BODY '{"client_id": "${CLIENT_ID}", "client_secret": "${CLIENT_SECRET}"}'
),
WEB data AS (
    URL 'https://api.company.com/data'
    HEADERS (Authorization = 'Bearer ' || (SELECT token FROM auth))
)
SELECT * FROM data;
```

## Use Cases

### 1. FX Rate Integration
```sql
WITH WEB fx_rates AS (
    URL 'https://api.exchangerate-api.com/v4/latest/USD'
    CACHE 3600
    FORMAT JSON
),
local_transactions AS (
    SELECT * FROM transactions
)
SELECT
    t.id,
    t.amount_usd,
    t.target_currency,
    t.amount_usd * fx.rates[t.target_currency] AS converted_amount
FROM local_transactions t
CROSS JOIN fx_rates fx;
```

### 2. Multi-Source Data Join
```sql
WITH WEB github_repos AS (
    URL 'https://api.github.com/users/torvalds/repos'
    FORMAT JSON
),
WEB github_user AS (
    URL 'https://api.github.com/users/torvalds'
    FORMAT JSON
)
SELECT
    u.name AS author,
    u.public_repos AS total_repos,
    r.name AS repo_name,
    r.stargazers_count
FROM github_user u
CROSS JOIN github_repos r
ORDER BY r.stargazers_count DESC
LIMIT 10;
```

### 3. Real-time Weather Analysis
```sql
WITH WEB current_weather AS (
    URL 'https://api.openweathermap.org/data/2.5/weather?q=London&appid=${WEATHER_API_KEY}'
    FORMAT JSON
),
historical_weather AS (
    SELECT * FROM weather_history WHERE city = 'London'
)
SELECT
    cw.main.temp AS current_temp,
    AVG(hw.temperature) AS historical_avg,
    cw.main.temp - AVG(hw.temperature) AS deviation
FROM current_weather cw
CROSS JOIN historical_weather hw
WHERE hw.date >= DATE_SUB(NOW(), INTERVAL 30 DAY)
GROUP BY cw.main.temp;
```

## Security Considerations

1. **URL Whitelisting**: Option to restrict URLs to specific domains
2. **Timeout Protection**: Default timeout to prevent hanging requests
3. **Size Limits**: Maximum response size to prevent memory exhaustion
4. **Rate Limiting**: Built-in rate limiting per domain
5. **SSL Verification**: Enforce HTTPS for sensitive data
6. **Token Security**: Never log or display authentication tokens
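
For item 1, whitelisting with wildcard patterns can be sketched as follows. `domain_allowed` is an illustrative name, and the choice that `*.example.com` matches subdomains but not the bare apex `example.com` is an assumption worth deciding explicitly:

```rust
/// Return true if `host` is covered by a whitelist pattern.
/// Patterns are exact hosts ("api.company.com") or a leading
/// wildcard label ("*.trusted-partner.com") matching any subdomain.
fn domain_allowed(host: &str, patterns: &[&str]) -> bool {
    patterns.iter().any(|p| {
        if let Some(suffix) = p.strip_prefix("*.") {
            // "*.example.com" matches "a.example.com" but not
            // "example.com" or "badexample.com".
            host.len() > suffix.len() + 1
                && host.ends_with(suffix)
                && host.as_bytes()[host.len() - suffix.len() - 1] == b'.'
        } else {
            host == *p
        }
    })
}
```

The explicit `.` boundary check matters: a naive `ends_with` would let `evilcompany.com` slip past a `*.company.com` pattern.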

## Configuration Options
```toml
# ~/.config/sql-cli/web.toml
[security]
allowed_domains = ["api.company.com", "*.trusted-partner.com"]
require_https = true
max_response_size_mb = 100
default_timeout_seconds = 30

[cache]
default_duration_seconds = 300
max_cache_size_mb = 500

[rate_limiting]
requests_per_minute = 60
burst_size = 10
```
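
The `rate_limiting` settings map naturally onto a token bucket; a minimal single-threaded sketch under that assumption (the real limiter would be kept per-domain and shared behind a mutex):

```rust
use std::time::Instant;

/// Minimal token bucket matching the config above: tokens refill at
/// `requests_per_minute`, and at most `burst` requests can fire at once.
struct TokenBucket {
    capacity: f64,
    tokens: f64,
    refill_per_sec: f64,
    last: Instant,
}

impl TokenBucket {
    fn new(requests_per_minute: f64, burst: f64) -> Self {
        Self {
            capacity: burst,
            tokens: burst,
            refill_per_sec: requests_per_minute / 60.0,
            last: Instant::now(),
        }
    }

    /// Try to take one token; returns false when the caller must wait.
    fn try_acquire(&mut self) -> bool {
        let now = Instant::now();
        let refill = now.duration_since(self.last).as_secs_f64() * self.refill_per_sec;
        self.tokens = (self.tokens + refill).min(self.capacity);
        self.last = now;
        if self.tokens >= 1.0 {
            self.tokens -= 1.0;
            true
        } else {
            false
        }
    }
}
```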

## Error Handling

- Network errors: Retry with exponential backoff
- Parse errors: Provide helpful error messages with sample data
- Authentication errors: Clear indication of auth failure
- Rate limit errors: Automatic retry after delay
- Timeout errors: Suggest increasing timeout
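
The retry delay for network errors can be computed with capped exponential backoff; a sketch (production code would also add random jitter to avoid synchronized retries):

```rust
use std::time::Duration;

/// Delay before retry `attempt` (0-based): base * 2^attempt, capped.
/// The shift is clamped and the multiply saturates to avoid overflow.
fn backoff_delay(attempt: u32, base_ms: u64, cap_ms: u64) -> Duration {
    let exp = base_ms.saturating_mul(1u64 << attempt.min(16));
    Duration::from_millis(exp.min(cap_ms))
}
```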

## Performance Optimizations

1. **Connection Pooling**: Reuse HTTP connections
2. **Parallel Fetching**: Fetch multiple WEB CTEs concurrently
3. **Smart Caching**: Cache based on URL + headers
4. **Streaming**: Stream large responses instead of loading all into memory
5. **Compression**: Support gzip/deflate response compression
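
Item 3 means the cache key must cover more than the URL, since the same endpoint fetched with different auth or `Accept` headers can return different data. A sketch using std's `DefaultHasher` for illustration (its output is not guaranteed stable across Rust releases, so this suits an in-memory cache only):

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::BTreeMap;
use std::hash::{Hash, Hasher};

/// Derive a cache key from the URL plus all request headers.
/// Headers live in a BTreeMap so iteration order is deterministic
/// and insertion order cannot change the key.
fn cache_key(url: &str, headers: &BTreeMap<String, String>) -> u64 {
    let mut h = DefaultHasher::new();
    url.hash(&mut h);
    for (k, v) in headers {
        k.hash(&mut h);
        v.hash(&mut h);
    }
    h.finish()
}
```

One caveat worth noting in the design: hashing `Authorization` values bakes secrets into cache keys, so entries must be evicted when tokens rotate.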

## Testing Strategy

1. **Mock Server**: Test suite with mock HTTP server
2. **Real APIs**: Integration tests with public APIs (GitHub, etc.)
3. **Error Scenarios**: Test network failures, timeouts, bad data
4. **Performance**: Benchmark with various response sizes
5. **Security**: Test token handling, URL validation

## Implementation Phases

### Phase 1: Basic GET with CSV/JSON (Week 1)
- [ ] Parser support for WEB keyword
- [ ] Basic HTTP GET client
- [ ] CSV parsing
- [ ] JSON flattening
- [ ] Simple integration tests

### Phase 2: Headers and Environment Variables (Week 2)
- [ ] Header support in parser
- [ ] Environment variable substitution
- [ ] Bearer token authentication
- [ ] API key authentication
- [ ] Cache implementation

### Phase 3: Advanced Features (Week 3)
- [ ] POST/PUT/DELETE methods
- [ ] Request body support
- [ ] Dynamic token fetching
- [ ] Rate limiting
- [ ] Comprehensive error handling

### Phase 4: Production Hardening (Week 4)
- [ ] Configuration file support
- [ ] URL whitelisting
- [ ] Performance optimizations
- [ ] Security audit
- [ ] Documentation and examples