pub async fn execute_with_client(
args: MatchArgs,
ai_client: Box<dyn AIProvider>,
config: &Config,
) -> Result<()>
Execute the matching workflow with dependency-injected AI client.
This function implements the core matching logic while accepting an AI client as a parameter, enabling dependency injection for testing and allowing different AI provider implementations to be used.
§Architecture Benefits
- Testability: Mock AI clients can be injected for unit testing
- Flexibility: Different AI providers can be used without code changes
- Isolation: Core logic is independent of AI client implementation
- Reusability: Function can be called with custom AI configurations
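As an illustration of the testability and flexibility points above, the call site stays the same regardless of which provider is injected. A minimal sketch follows; use_mock, args, and config stand in for values built elsewhere, and the import path for AIProvider is assumed to match the one shown for MockAIClient in the example at the bottom of this page.
use subx_cli::commands::match_command;
use subx_cli::services::ai::{AIProvider, MockAIClient};
// Any AIProvider implementation can be boxed and injected;
// the call into execute_with_client does not change.
let client: Box<dyn AIProvider> = if use_mock {
    Box::new(MockAIClient::new())
} else {
    // Constructing a production provider is intentionally left out of this sketch.
    unimplemented!("build a real AI provider from configuration")
};
match_command::execute_with_client(args, client, &config).await?;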
§Matching Process
1. Configuration Setup: Load matching parameters and thresholds
2. Engine Initialization: Create matching engine with AI client
3. File Discovery: Scan for video and subtitle files
4. Content Analysis: Extract and analyze subtitle content
5. AI Matching: Send content to AI service for correlation analysis
6. Result Processing: Evaluate confidence and generate operations
7. Operation Execution: Apply file changes or save dry-run results
§Dry-run vs Live Mode
§Dry-run Mode (args.dry_run = true)
- No actual file modifications are performed
- Results are cached for potential later application
- Operations are displayed for user review
- Safe for testing and verification
§Live Mode (args.dry_run = false)
- File operations are actually executed
- Backups are created if enabled
- Changes are applied atomically where possible
- Progress is tracked and displayed
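In practice the two modes combine into a preview-then-apply flow: run once with dry_run enabled, review the reported operations, then run again in live mode. A minimal sketch, assuming MatchArgs implements Clone; base_args, the two boxed clients, and config stand in for values built elsewhere:
// First pass: preview the planned operations without touching any files.
let preview = MatchArgs { dry_run: true, ..base_args.clone() };
match_command::execute_with_client(preview, preview_client, &config).await?;
// After reviewing the dry-run output, apply the same matching for real.
let live = MatchArgs { dry_run: false, ..base_args };
match_command::execute_with_client(live, live_client, &config).await?;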
§Arguments
- args - Command-line arguments with matching configuration
- ai_client - AI provider implementation for content analysis
- config - Application configuration
§Returns
Returns Ok(()) on successful completion, or an error describing the failure point in the matching workflow.
§Error Handling
The function provides comprehensive error handling:
- Early Validation: Configuration and argument validation
- Graceful Degradation: Partial completion when possible
- Clear Messaging: Descriptive error messages for user guidance
- State Preservation: No partial file modifications on errors
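From the caller's side this is ordinary Result handling. A minimal sketch, assuming only that the returned error type implements Display:
if let Err(err) = match_command::execute_with_client(args, ai_client, &config).await {
    // The message describes where in the workflow the failure occurred,
    // and no partial file modifications are left behind (see State Preservation above).
    eprintln!("match failed: {err}");
    std::process::exit(1);
}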
§Caching Strategy
- AI Results: Cached to reduce API costs and improve performance
- Content Analysis: Subtitle parsing results cached per file
- Match Results: Dry-run results saved for later application
- Configuration: Processed configuration cached for efficiency
§Examples
use subx_cli::commands::match_command;
use subx_cli::cli::MatchArgs;
use subx_cli::services::ai::MockAIClient;
use std::path::PathBuf;
// Testing with a mock AI client, inside an async function returning a Result.
// A Config value is also required; Config::default() stands in here and assumes
// the type implements Default.
let config = Config::default();
let mock_client = Box::new(MockAIClient::new());
let args = MatchArgs {
    path: PathBuf::from("./test_data"),
    recursive: false,
    dry_run: true,
    confidence: 90,
    backup: false,
};
match_command::execute_with_client(args, mock_client, &config).await?;