Function execute 

pub async fn execute(
    args: MatchArgs,
    config_service: &dyn ConfigService,
) -> Result<()>

Execute the full AI-powered subtitle matching workflow.

This is the main entry point for the match command, which orchestrates the entire matching process from configuration loading through file operations. It automatically creates the appropriate AI client based on configuration settings and delegates to the core matching logic.

§Process Overview

  1. Configuration Loading: Load user and system configuration
  2. AI Client Creation: Initialize AI provider based on settings
  3. Matching Execution: Delegate to core matching implementation
  4. Result Processing: Handle results and display output
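At a glance, the four steps above can be sketched as follows. The types and functions here (`StubConfigService`, `create_ai_client`, `execute_sketch`) are illustrative stand-ins, not the crate's actual internals:

```rust
// Hypothetical sketch of the execute() workflow; names are illustrative,
// not subx_cli's real internals.

trait ConfigService {
    fn ai_provider(&self) -> String;
}

struct StubConfigService;
impl ConfigService for StubConfigService {
    fn ai_provider(&self) -> String {
        "openai".to_string()
    }
}

// Step 2: pick an AI client based on configuration.
fn create_ai_client(provider: &str) -> Result<String, String> {
    match provider {
        "openai" | "anthropic" | "local" => Ok(format!("{provider}-client")),
        other => Err(format!("unknown provider: {other}")),
    }
}

// Steps 1-4 strung together, mirroring the documented flow.
fn execute_sketch(config: &dyn ConfigService) -> Result<String, String> {
    let provider = config.ai_provider(); // 1. load configuration
    let client = create_ai_client(&provider)?; // 2. create AI client
    // 3. matching execution and 4. result processing would follow here.
    Ok(client)
}

fn main() {
    println!("{:?}", execute_sketch(&StubConfigService));
}
```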

§Configuration Integration

The function automatically loads configuration from multiple sources:

  • System-wide configuration files
  • User-specific configuration directory
  • Environment variables
  • Command-line argument overrides
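The precedence among these sources (each later source overrides the earlier ones) can be pictured as a layered lookup; this is a minimal sketch, not the crate's actual resolution code:

```rust
use std::collections::HashMap;

// Minimal sketch of layered configuration: later layers override earlier
// ones, mirroring system -> user -> environment -> CLI precedence.
fn resolve(layers: &[HashMap<&str, &str>], key: &str) -> Option<String> {
    layers
        .iter()
        .rev() // search the highest-precedence layer first
        .find_map(|layer| layer.get(key).map(|v| v.to_string()))
}

fn main() {
    let system = HashMap::from([("provider", "openai")]);
    let user = HashMap::from([("provider", "anthropic")]);
    let cli: HashMap<&str, &str> = HashMap::new();
    // The user layer overrides the system default; the CLI set nothing.
    assert_eq!(
        resolve(&[system, user, cli], "provider"),
        Some("anthropic".to_string())
    );
}
```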

§AI Provider Selection

AI client creation is based on configuration settings:

[ai]
provider = "openai"  # or "anthropic", "local", etc.
openai.api_key = "sk-..."
openai.model = "gpt-4-turbo-preview"

§Arguments

  • args - Parsed command-line arguments containing:
    • path: Directory or file path to process
    • recursive: Whether to scan subdirectories
    • dry_run: Preview mode without actual file changes
    • confidence: Minimum confidence threshold (0-100)
    • backup: Enable automatic file backups
  • config_service - Configuration service used to load settings and select the AI provider

§Returns

Returns Ok(()) on successful completion, or an error containing:

  • Configuration loading failures
  • AI client initialization problems
  • Matching operation errors
  • File system operation failures

§Errors

Common error conditions include:

  • Configuration Error: Invalid or missing configuration files
  • AI Service Error: API authentication or connectivity issues
  • File System Error: Permission or disk space problems
  • Content Error: Invalid or corrupted subtitle files
  • Network Error: Connection issues with AI services
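These categories could be modeled as an error enum and handled per kind; the `MatchError` type below is a hypothetical illustration for callers, not the crate's actual error type:

```rust
use std::fmt;

// Hypothetical error taxonomy mirroring the categories listed above.
#[derive(Debug, PartialEq)]
enum MatchError {
    Config(String),
    AiService(String),
    FileSystem(String),
    Content(String),
    Network(String),
}

impl fmt::Display for MatchError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            MatchError::Config(m) => write!(f, "configuration error: {m}"),
            MatchError::AiService(m) => write!(f, "AI service error: {m}"),
            MatchError::FileSystem(m) => write!(f, "file system error: {m}"),
            MatchError::Content(m) => write!(f, "content error: {m}"),
            MatchError::Network(m) => write!(f, "network error: {m}"),
        }
    }
}

// Only transient failures are worth an automatic retry.
fn is_retryable(err: &MatchError) -> bool {
    matches!(err, MatchError::Network(_) | MatchError::AiService(_))
}

fn main() {
    let err = MatchError::Network("connection reset".into());
    assert!(is_retryable(&err));
    println!("{err}");
}
```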

§Examples

use subx_cli::cli::MatchArgs;
use subx_cli::commands::match_command;
use std::path::PathBuf;

// Basic matching with default settings
let args = MatchArgs {
    path: PathBuf::from("./media"),
    recursive: true,
    dry_run: false,
    confidence: 85,
    backup: true,
};

// `config_service` is any value implementing `ConfigService`.
match_command::execute(args, &config_service).await?;

// Dry-run mode for preview
let preview_args = MatchArgs {
    path: PathBuf::from("./test_media"),
    recursive: false,
    dry_run: true,
    confidence: 70,
    backup: false,
};

match_command::execute(preview_args, &config_service).await?;

§Performance Considerations

  • Caching: AI results are automatically cached to reduce API costs
  • Batch Processing: Multiple files processed efficiently in parallel
  • Rate Limiting: Automatic throttling to respect AI service limits
  • Memory Management: Streaming processing for large file sets
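The caching behavior can be pictured as memoization keyed by request, so repeated queries skip the AI call. This sketch uses a simple in-memory map; the real cache's key format and storage backend are not specified here:

```rust
use std::collections::HashMap;

// Illustrative memoization of AI match results, keyed by request.
// A cache hit returns the stored score without invoking the expensive
// analysis closure again.
struct MatchCache {
    entries: HashMap<String, u8>, // request key -> confidence score
    misses: u32,
}

impl MatchCache {
    fn new() -> Self {
        Self { entries: HashMap::new(), misses: 0 }
    }

    fn get_or_compute(&mut self, key: &str, compute: impl FnOnce() -> u8) -> u8 {
        if let Some(&score) = self.entries.get(key) {
            return score; // cache hit: no API cost
        }
        self.misses += 1;
        let score = compute();
        self.entries.insert(key.to_string(), score);
        score
    }
}

fn main() {
    let mut cache = MatchCache::new();
    let a = cache.get_or_compute("movie.mkv|movie.srt", || 92);
    let b = cache.get_or_compute("movie.mkv|movie.srt", || 10); // closure not called
    println!("{a} {b} (misses: {})", cache.misses);
}
```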