agpm_cli/resolver/mod.rs
1//! Dependency resolution and conflict detection for AGPM.
2//!
3//! This module implements the core dependency resolution algorithm that transforms
4//! manifest dependencies into locked versions. It handles version constraint solving,
5//! conflict detection, transitive dependency resolution,
6//! parallel source synchronization, and relative path preservation during installation.
7//!
8//! # Architecture Overview
9//!
10//! The resolver operates using a **two-phase architecture** optimized for SHA-based worktree caching:
11//!
12//! ## Phase 1: Source Synchronization (`pre_sync_sources`)
13//! - **Purpose**: Perform all Git network operations upfront during "Syncing sources" phase
14//! - **Operations**: Clone/fetch repositories, update refs, populate cache
15//! - **Benefits**: Clear progress reporting, batch network operations, error isolation
16//! - **Result**: All required repositories cached locally for phase 2
17//!
18//! ## Phase 2: Version Resolution (`resolve` or `update`)
19//! - **Purpose**: Resolve versions to commit SHAs using cached repositories
20//! - **Operations**: Parse dependencies, resolve constraints, detect conflicts, create worktrees, preserve paths
21//! - **Benefits**: Fast local operations, no network I/O, deterministic behavior
22//! - **Result**: Locked dependencies ready for installation with preserved directory structure
23//!
24//! This two-phase approach replaces the previous three-phase model and provides better
25//! separation of concerns between network operations and dependency resolution logic.
26//!
27//! ## Algorithm Complexity
28//!
29//! - **Time**: O(n + s·log(t)) where:
30//! - n = number of dependencies
31//! - s = number of unique sources
32//! - t = average number of tags/branches per source
33//! - **Space**: O(n + s) for dependency graph and source cache
34//!
35//! ## Parallel Processing
36//!
37//! The resolver leverages async/await for concurrent operations:
38//! - Sources are synchronized in parallel using [`tokio::spawn`]
39//! - Git operations are batched to minimize network roundtrips
40//! - Progress reporting provides real-time feedback on long-running operations
41//!
42//! # Resolution Process
43//!
44//! The two-phase dependency resolution follows these steps:
45//!
46//! ## Phase 1: Source Synchronization
47//! 1. **Dependency Collection**: Extract all dependencies from manifest
48//! 2. **Source Validation**: Verify all referenced sources exist and are accessible
49//! 3. **Repository Preparation**: Use [`version_resolver::VersionResolver`] to collect unique sources
50//! 4. **Source Synchronization**: Clone/update source repositories with single fetch per repository
51//! 5. **Cache Population**: Store bare repository paths for phase 2 operations
52//!
53//! ## Phase 2: Version Resolution & Installation
54//! 1. **Batch SHA Resolution**: Resolve all collected versions to commit SHAs using cached repositories
55//! 2. **SHA-based Worktree Creation**: Create worktrees keyed by commit SHA for maximum deduplication
56//! 3. **Conflict Detection**: Check for path conflicts and incompatible versions
57//! 4. **Redundancy Analysis**: Identify duplicate resources across sources
58//! 5. **Path Processing**: Preserve directory structure by using paths directly from dependencies
59//! 6. **Resource Installation**: Copy resources to target locations with checksums
60//! 7. **Lockfile Generation**: Create deterministic lockfile entries with resolved SHAs and preserved paths
61//!
62//! This separation ensures all network operations complete in phase 1, while phase 2
63//! operates entirely on cached data for fast, deterministic resolution.
64//!
65//! ## Version Resolution Strategy
66//!
67//! Version constraints are resolved using the following precedence:
68//! 1. **Exact commits**: SHA hashes are used directly
69//! 2. **Tags**: Semantic version tags (e.g., `v1.2.3`) are preferred
70//! 3. **Branches**: Branch heads are resolved to current commits
71//! 4. **Latest**: Defaults to the default branch (usually `main` or `master`)
72//!
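//! As an illustrative sketch (not AGPM's exact implementation), matching a tag constraint
//! against the tags fetched from a source could look like the following, assuming a
//! semver-style matcher such as the `semver` crate:
//!
//! ```ignore
//! use semver::{Version, VersionReq};
//!
//! // Tags listed from the cached source repository (illustrative values).
//! let tags = ["v1.0.0", "v1.2.0", "v1.2.3", "v2.0.0"];
//! let req = VersionReq::parse("^1.2").unwrap();
//!
//! // Pick the highest tag that satisfies the constraint.
//! let best = tags
//!     .iter()
//!     .filter_map(|t| Version::parse(t.trim_start_matches('v')).ok().map(|v| (v, *t)))
//!     .filter(|(v, _)| req.matches(v))
//!     .max_by(|(a, _), (b, _)| a.cmp(b))
//!     .map(|(_, t)| t);
//!
//! assert_eq!(best, Some("v1.2.3"));
//! ```
//!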
73//! # Conflict Detection
74//!
75//! The resolver detects several types of conflicts:
76//!
77//! ## Version Conflicts
78//! ```toml
79//! # Incompatible version constraints for the same resource
80//! [agents]
81//! app = { source = "community", path = "agents/helper.md", version = "v1.0.0" }
82//! tool = { source = "community", path = "agents/helper.md", version = "v2.0.0" }
83//! ```
84//!
85//! ## Path Conflicts
86//! ```toml
87//! # Different resources installing to the same location
88//! [agents]
89//! helper-v1 = { source = "old", path = "agents/helper.md" }
90//! helper-v2 = { source = "new", path = "agents/helper.md" }
91//! ```
92//!
93//! ## Source Conflicts
94//! When the same resource path exists in multiple sources with different content,
//! the resolver uses source precedence (sources defined in the local manifest override
//! same-named sources from the global config).
96//!
97//! # Security Considerations
98//!
99//! The resolver implements several security measures:
100//!
101//! - **Input Validation**: All Git references are validated before checkout
102//! - **Path Sanitization**: Installation paths are validated to prevent directory traversal
103//! - **Credential Isolation**: Authentication tokens are never stored in manifest files
104//! - **Checksum Verification**: Resources are checksummed for integrity validation
105//!
106//! # Performance Optimizations
107//!
108//! - **SHA-based Worktree Caching**: Worktrees keyed by commit SHA maximize reuse across versions
109//! - **Batch Version Resolution**: All versions resolved to SHAs upfront via [`version_resolver::VersionResolver`]
//! - **Single Fetch Per Repository**: Per-command fetch caching ensures each repository is fetched at most once per run
111//! - **Source Caching**: Git repositories are cached globally in `~/.agpm/cache/`
112//! - **Incremental Updates**: Only modified sources are re-synchronized
113//! - **Parallel Operations**: Source syncing and version resolution run concurrently
114//! - **Progress Batching**: UI updates are throttled to prevent performance impact
115//!
116//! # Error Handling
117//!
118//! The resolver provides detailed error context for common failure scenarios:
119//!
120//! - **Network Issues**: Graceful handling of Git clone/fetch failures
121//! - **Authentication**: Clear error messages for credential problems
122//! - **Version Mismatches**: Specific guidance for constraint resolution failures
123//! - **Path Issues**: Detailed information about file system conflicts
124//!
125//! # Example Usage
126//!
127//! ## Two-Phase Resolution Pattern
128//! ```rust,no_run
129//! use agpm_cli::resolver::DependencyResolver;
130//! use agpm_cli::manifest::Manifest;
131//! use agpm_cli::cache::Cache;
132//! use std::path::Path;
133//!
134//! # async fn example() -> anyhow::Result<()> {
135//! let manifest = Manifest::load(Path::new("agpm.toml"))?;
136//! let cache = Cache::new()?;
137//! let mut resolver = DependencyResolver::new_with_global(manifest.clone(), cache).await?;
138//!
139//! // Get all dependencies from manifest
140//! let deps: Vec<(String, agpm_cli::manifest::ResourceDependency)> = manifest
141//! .all_dependencies()
142//! .into_iter()
143//! .map(|(name, dep)| (name.to_string(), dep.clone()))
144//! .collect();
145//!
146//! // Phase 1: Sync all required sources (network operations)
147//! resolver.pre_sync_sources(&deps).await?;
148//!
149//! // Phase 2: Resolve dependencies using cached repositories (local operations)
150//! let lockfile = resolver.resolve().await?;
151//!
152//! println!("Resolved {} agents and {} snippets",
153//! lockfile.agents.len(), lockfile.snippets.len());
154//! # Ok(())
155//! # }
156//! ```
157//!
158//! ## Update Pattern
159//! ```rust,no_run
160//! # use agpm_cli::resolver::DependencyResolver;
161//! # use agpm_cli::manifest::Manifest;
162//! # use agpm_cli::cache::Cache;
163//! # use agpm_cli::lockfile::LockFile;
164//! # use std::path::Path;
165//! # async fn update_example() -> anyhow::Result<()> {
166//! let manifest = Manifest::load(Path::new("agpm.toml"))?;
167//! let mut lockfile = LockFile::load(Path::new("agpm.lock"))?;
168//! let cache = Cache::new()?;
169//! let mut resolver = DependencyResolver::with_cache(manifest.clone(), cache);
170//!
171//! // Get dependencies to update
172//! let deps: Vec<(String, agpm_cli::manifest::ResourceDependency)> = manifest
173//! .all_dependencies()
174//! .into_iter()
175//! .map(|(name, dep)| (name.to_string(), dep.clone()))
176//! .collect();
177//!
178//! // Phase 1: Sync sources for update
179//! resolver.pre_sync_sources(&deps).await?;
180//!
181//! // Phase 2: Update specific dependencies
182//! resolver.update(&mut lockfile, None).await?;
183//!
184//! lockfile.save(Path::new("agpm.lock"))?;
185//! # Ok(())
186//! # }
187//! ```
188//!
189//! ## Incremental Updates
190//! ```rust,no_run
191//! use agpm_cli::resolver::DependencyResolver;
192//! use agpm_cli::lockfile::LockFile;
193//! use agpm_cli::cache::Cache;
194//! use std::path::Path;
195//!
196//! # async fn update_example() -> anyhow::Result<()> {
197//! let existing = LockFile::load("agpm.lock".as_ref())?;
198//! let manifest = agpm_cli::manifest::Manifest::load("agpm.toml".as_ref())?;
199//! let cache = Cache::new()?;
200//! let mut resolver = DependencyResolver::new(manifest, cache)?;
201//!
202//! // Update specific dependencies only
203//! let deps_to_update = vec!["agent1".to_string(), "snippet2".to_string()];
204//! let deps_count = deps_to_update.len();
205//! let updated = resolver.update(&existing, Some(deps_to_update)).await?;
206//!
207//! println!("Updated {} dependencies", deps_count);
208//! # Ok(())
209//! # }
210//! ```
211
212pub mod dependency_graph;
213pub mod version_resolution;
214pub mod version_resolver;
215
216use crate::cache::Cache;
217use crate::core::{AgpmError, ResourceType};
218use crate::git::GitRepo;
219use crate::lockfile::{LockFile, LockedResource};
220use crate::manifest::{Manifest, ResourceDependency};
221use crate::metadata::MetadataExtractor;
222use crate::source::SourceManager;
223use crate::utils::{compute_relative_install_path, normalize_path, normalize_path_for_storage};
224use crate::version::conflict::ConflictDetector;
225use anyhow::{Context, Result};
226use std::collections::{HashMap, HashSet};
227use std::path::{Path, PathBuf};
228use std::time::Duration;
229
230use self::dependency_graph::{DependencyGraph, DependencyNode};
231use self::version_resolver::VersionResolver;
232
233/// Key for identifying unique dependencies in maps and sets.
234///
235/// Combines resource type, name, source, and tool to uniquely identify a dependency.
236/// This prevents collisions between same-named resources from different sources or
237/// using different tools.
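/// For example: `(ResourceType::Agent, "helper".to_string(), Some("community".to_string()), None)`
/// (illustrative values).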
238type DependencyKey = (ResourceType, String, Option<String>, Option<String>);
239
240/// Extract meaningful path structure from a dependency path.
241///
242/// This function handles three cases:
243/// 1. Relative paths with `../` - strips all parent directory components
244/// 2. Absolute paths - resolves `..` components and strips root/prefix
245/// 3. Clean relative paths - uses as-is with normalized separators
246///
247/// # Examples
248///
249/// ```
250/// use std::path::Path;
251/// # use agpm_cli::resolver::extract_meaningful_path;
252///
253/// // Relative paths with parent navigation
254/// assert_eq!(extract_meaningful_path(Path::new("../../snippets/dir/file.md")), "snippets/dir/file.md");
255///
256/// // Absolute paths (root stripped, .. resolved) - Unix-style path
257/// #[cfg(unix)]
258/// assert_eq!(extract_meaningful_path(Path::new("/tmp/foo/../bar/agent.md")), "tmp/bar/agent.md");
259///
260/// // Absolute paths (root stripped, .. resolved) - Windows-style path
261/// #[cfg(windows)]
262/// assert_eq!(extract_meaningful_path(Path::new("C:\\tmp\\foo\\..\\bar\\agent.md")), "tmp/bar/agent.md");
263///
264/// // Clean relative paths
265/// assert_eq!(extract_meaningful_path(Path::new("agents/test.md")), "agents/test.md");
266/// ```
267pub fn extract_meaningful_path(path: &Path) -> String {
268 let components: Vec<_> = path.components().collect();
269
270 if path.is_absolute() {
271 // Case 2: Absolute path - resolve ".." components first, then strip root
272 let mut resolved = Vec::new();
273
274 for component in components.iter() {
275 match component {
276 std::path::Component::Normal(name) => {
277 resolved.push(name.to_str().unwrap_or(""));
278 }
279 std::path::Component::ParentDir => {
280 // Pop the last component if there is one
281 resolved.pop();
282 }
283 // Skip RootDir, Prefix, and CurDir
284 _ => {}
285 }
286 }
287
288 resolved.join("/")
289 } else if components.iter().any(|c| matches!(c, std::path::Component::ParentDir)) {
290 // Case 1: Relative path with "../" - skip all parent components
291 let start_idx = components
292 .iter()
293 .position(|c| matches!(c, std::path::Component::Normal(_)))
294 .unwrap_or(0);
295
296 components[start_idx..]
297 .iter()
298 .filter_map(|c| c.as_os_str().to_str())
299 .collect::<Vec<_>>()
300 .join("/")
301 } else {
302 // Case 3: Clean relative path - use as-is
303 path.to_str().unwrap_or("").replace('\\', "/") // Normalize to forward slashes
304 }
305}
306
307/// Type alias for resource lookup key: (`ResourceType`, name, source).
308///
309/// Used internally by the resolver to uniquely identify resources across different sources.
310/// This enables precise lookups when multiple resources share the same name but come from
311/// different sources (common with transitive dependencies).
312///
313/// # Components
314///
315/// * `ResourceType` - The type of resource (Agent, Snippet, Command, Script, Hook, `McpServer`)
316/// * `String` - The resource name as defined in the manifest
317/// * `Option<String>` - The source name (None for local resources without a source)
318///
319/// # Usage
320///
321/// This key is used in `HashMap<ResourceKey, ResourceInfo>` to build a complete map
322/// of all resources in the lockfile for efficient cross-source dependency resolution.
323type ResourceKey = (crate::core::ResourceType, String, Option<String>);
324
325/// Type alias for resource information: (source, version).
326///
327/// Stores the source and version information for a resolved resource. Used in conjunction
328/// with [`ResourceKey`] to enable efficient lookups during transitive dependency resolution.
329///
330/// # Components
331///
332/// * First `Option<String>` - The source name (None for local resources)
333/// * Second `Option<String>` - The version constraint (None for unpinned resources)
334///
335/// # Usage
336///
337/// Paired with [`ResourceKey`] in a `HashMap<ResourceKey, ResourceInfo>` to store
338/// resource metadata for cross-source dependency resolution. This allows the resolver
339/// to construct fully-qualified dependency references like `source:type/name:version`.
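///
/// A minimal sketch of how the two aliases pair up in the lookup map (the resource name,
/// source, and version shown are illustrative):
///
/// ```ignore
/// use std::collections::HashMap;
///
/// let mut resources: HashMap<ResourceKey, ResourceInfo> = HashMap::new();
/// resources.insert(
///     (crate::core::ResourceType::Agent, "helper".to_string(), Some("community".to_string())),
///     (Some("community".to_string()), Some("v1.2.0".to_string())),
/// );
/// ```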
340type ResourceInfo = (Option<String>, Option<String>);
341
342/// Read a file with retry logic to handle cross-process filesystem cache coherency issues.
343///
344/// This function wraps `std::fs::read_to_string` with retry logic to handle cases where
345/// files created by Git subprocesses are not immediately visible to the parent Rust process
346/// due to filesystem cache propagation delays. This is particularly important in CI
347/// environments with network-attached storage where cache coherency delays can be significant.
348///
349/// # Arguments
350///
351/// * `path` - The file path to read
352///
353/// # Returns
354///
355/// Returns the file content as a `String`, or an error if the file cannot be read after retries.
356///
357/// # Retry Strategy
358///
359/// - Initial delay: 10ms
360/// - Max delay: 500ms (capped exponential backoff)
361/// - Max attempts: 10
/// - Total max time: ~2.6 seconds of backoff
363///
364/// Only `NotFound` errors are retried, as these indicate cache coherency issues.
365/// Other errors (permissions, I/O errors) fail immediately.
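///
/// # Example
///
/// A sketch of a typical call site (the path is illustrative):
///
/// ```ignore
/// let content = read_with_cache_retry_sync(Path::new("path/to/installed/agent.md"))?;
/// ```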
366fn read_with_cache_retry_sync(path: &Path) -> Result<String> {
367 use std::io;
368 use std::thread;
369
370 let mut attempts = 0;
371 const MAX_ATTEMPTS: u32 = 10;
372
373 loop {
374 match std::fs::read_to_string(path) {
375 Ok(content) => return Ok(content),
376 Err(e) if e.kind() == io::ErrorKind::NotFound && attempts < MAX_ATTEMPTS => {
377 attempts += 1;
378 // Exponential backoff: 10ms, 20ms, 40ms, 80ms, 160ms, 320ms, 500ms (capped)
                let delay_ms = std::cmp::min(10 * (1 << (attempts - 1)), 500);
380 let delay = Duration::from_millis(delay_ms);
381
382 tracing::debug!(
383 "File not yet visible (attempt {}/{}): {}",
384 attempts,
385 MAX_ATTEMPTS,
386 path.display()
387 );
388
389 thread::sleep(delay);
390 }
391 Err(e) => {
392 return Err(anyhow::anyhow!("Failed to read file: {}", path.display()).context(e));
393 }
394 }
395 }
396}
397
398/// Core dependency resolver that transforms manifest dependencies into lockfile entries.
399///
400/// The [`DependencyResolver`] is the main entry point for dependency resolution.
401/// It manages source repositories, resolves version constraints, detects conflicts,
402/// and generates deterministic lockfile entries using a centralized SHA-based
403/// resolution strategy for optimal performance.
404///
405/// # SHA-Based Resolution Workflow
406///
407/// Starting in v0.3.2, the resolver uses [`VersionResolver`] for centralized version
408/// resolution that minimizes Git operations and maximizes worktree reuse:
409/// 1. **Collection**: Gather all (source, version) pairs from dependencies
410/// 2. **Batch Resolution**: Resolve all versions to commit SHAs in parallel
411/// 3. **SHA-Based Worktrees**: Create worktrees keyed by commit SHA
412/// 4. **Deduplication**: Multiple refs to same SHA share one worktree
413///
414/// # Configuration
415///
416/// The resolver can be configured in several ways:
417/// - **Standard**: Uses manifest sources only via [`new()`]
418/// - **Global**: Includes global config sources via [`new_with_global()`]
419/// - **Custom Cache**: Uses custom cache directory via [`with_cache()`]
420///
421/// # Thread Safety
422///
423/// The resolver is not thread-safe due to its mutable state during resolution.
424/// Create separate instances for concurrent operations.
425///
426/// [`new()`]: DependencyResolver::new
427/// [`new_with_global()`]: DependencyResolver::new_with_global
428/// [`with_cache()`]: DependencyResolver::with_cache
429pub struct DependencyResolver {
430 manifest: Manifest,
431 /// Manages Git repository operations, source URL resolution, and authentication.
432 ///
433 /// The source manager handles:
434 /// - Mapping source names to Git repository URLs
435 /// - Git operations (clone, fetch, checkout) for dependency resolution
436 /// - Authentication token management for private repositories
437 /// - Source validation and configuration management
438 pub source_manager: SourceManager,
439 cache: Cache,
440 /// Cached per-(source, version) preparation results built during the
441 /// analysis stage so individual dependency resolution can reuse worktrees
442 /// without triggering additional sync operations.
443 prepared_versions: HashMap<String, PreparedSourceVersion>,
444 /// Centralized version resolver for efficient SHA-based dependency resolution.
445 ///
446 /// The `VersionResolver` handles the crucial first phase of dependency resolution
447 /// by batch-resolving all version specifications to commit SHAs before any worktree
448 /// operations. This strategy enables maximum worktree reuse and minimal Git operations.
449 ///
450 /// Used by [`prepare_remote_groups`] to resolve all dependencies upfront.
451 version_resolver: VersionResolver,
452 /// Dependency graph tracking which resources depend on which others.
453 ///
454 /// Maps from (`resource_type`, name, source, tool) to a list of dependencies in the format
455 /// "`resource_type/name`". This is populated during transitive dependency
456 /// resolution and used to fill the dependencies field in `LockedResource` entries.
457 /// The source and tool are included to prevent cross-source and cross-tool dependency contamination.
458 dependency_map: HashMap<DependencyKey, Vec<String>>,
459 /// Conflict detector for identifying version conflicts.
460 ///
461 /// Tracks version requirements across all dependencies (direct and transitive)
462 /// and detects incompatible version constraints before lockfile creation.
463 conflict_detector: ConflictDetector,
464 /// Maps resource names to their original pattern alias for pattern-expanded dependencies.
465 ///
466 /// When a pattern dependency (e.g., `all-helpers = { path = "agents/helpers/*.md" }`)
467 /// expands to multiple concrete dependencies (helper-alpha, helper-beta, etc.), this
468 /// map tracks which pattern alias each expanded resource came from. This enables
469 /// patches defined under the pattern alias to be correctly applied to all matched resources.
470 ///
471 /// Example: If "all-helpers" expands to "helper-alpha" and "helper-beta", this map contains:
472 /// - (ResourceType::Agent, "helper-alpha") -> "all-helpers"
473 /// - (ResourceType::Agent, "helper-beta") -> "all-helpers"
474 ///
475 /// The key includes ResourceType to prevent collisions when different resource types
476 /// have the same concrete name (e.g., agents/deploy.md and commands/deploy.md).
477 pattern_alias_map: HashMap<(ResourceType, String), String>,
478}
479
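/// Prepared worktree information for a single `(source, version)` pair.
///
/// Holds the worktree path created for the resolved commit, together with the resolved
/// Git reference and commit SHA, so individual dependency resolution can reuse the
/// worktree without re-syncing the source.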
480#[derive(Clone, Debug, Default)]
481struct PreparedSourceVersion {
482 worktree_path: PathBuf,
483 resolved_version: Option<String>,
484 resolved_commit: String,
485}
486
487#[derive(Clone, Debug)]
488#[allow(dead_code)]
489struct PreparedGroupDescriptor {
490 key: String,
491 source: String,
492 requested_version: Option<String>,
493 version_key: String,
494}
495
496impl DependencyResolver {
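    /// Builds the key used to group prepared source versions, in the form
    /// `"<source>::<version>"` (for example, `"community::v1.0.0"`).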
497 fn group_key(source: &str, version: &str) -> String {
498 format!("{source}::{version}")
499 }
500
501 /// Adds or updates a resource entry in the lockfile based on resource type.
502 ///
503 /// This helper method eliminates code duplication between the `resolve()` and `update()`
    /// methods by centralizing lockfile entry management logic. It reads the resource type
    /// from the entry's `resource_type` field and adds or updates the entry in the
    /// appropriate collection within the lockfile.
507 ///
508 /// The method performs upsert behavior - if an entry with matching name and source
509 /// already exists in the appropriate collection, it will be updated (including version);
510 /// otherwise, a new entry is added. This allows version updates (e.g., v1.0 → v2.0)
511 /// to replace the existing entry rather than creating duplicates.
512 ///
513 /// # Arguments
514 ///
515 /// * `lockfile` - Mutable reference to the lockfile to modify
    /// * `_name` - The resource name (currently unused; the resource type is read from `entry.resource_type`)
517 /// * `entry` - The [`LockedResource`] entry to add or update
518 ///
    /// # Resource Type Mapping
    ///
    /// Resource types are threaded from the manifest through the resolution pipeline; the
    /// entry's `resource_type` field selects the lockfile collection:
    /// - `ResourceType::Agent` → `lockfile.agents`
    /// - `ResourceType::Snippet` → `lockfile.snippets`
    /// - `ResourceType::Command` → `lockfile.commands`
    /// - `ResourceType::Script` → `lockfile.scripts`
    /// - `ResourceType::Hook` → `lockfile.hooks`
    /// - `ResourceType::McpServer` → `lockfile.mcp_servers`
529 ///
530 /// # Example
531 ///
532 /// ```ignore
533 /// # use agpm_cli::lockfile::{LockFile, LockedResource};
534 /// # use agpm_cli::core::ResourceType;
535 /// # use agpm_cli::resolver::DependencyResolver;
    /// # let manifest = agpm_cli::manifest::Manifest::new();
    /// # let cache = agpm_cli::cache::Cache::new()?;
    /// # let resolver = DependencyResolver::new(manifest, cache)?;
537 /// let mut lockfile = LockFile::new();
538 /// let entry = LockedResource {
539 /// name: "my-agent".to_string(),
540 /// source: Some("github".to_string()),
541 /// url: Some("https://github.com/org/repo.git".to_string()),
542 /// path: "agents/my-agent.md".to_string(),
543 /// version: Some("v1.0.0".to_string()),
544 /// resolved_commit: Some("abc123def456...".to_string()),
545 /// checksum: "sha256:a1b2c3d4...".to_string(),
546 /// installed_at: ".claude/agents/my-agent.md".to_string(),
547 /// dependencies: vec![],
548 /// resource_type: ResourceType::Agent,
549 /// };
550 ///
551 /// // Automatically adds to agents collection based on resource type detection
552 /// resolver.add_or_update_lockfile_entry(&mut lockfile, "my-agent", entry);
553 /// assert_eq!(lockfile.agents.len(), 1);
554 ///
555 /// // Subsequent calls update the existing entry
556 /// let updated_entry = LockedResource {
557 /// name: "my-agent".to_string(),
558 /// version: Some("v1.1.0".to_string()),
559 /// // ... other fields
560 /// # source: Some("github".to_string()),
561 /// # url: Some("https://github.com/org/repo.git".to_string()),
562 /// # path: "agents/my-agent.md".to_string(),
563 /// # resolved_commit: Some("def456789abc...".to_string()),
    /// #     checksum: "sha256:b2c3d4e5...".to_string(),
565 /// # installed_at: ".claude/agents/my-agent.md".to_string(),
566 /// dependencies: vec![],
567 /// resource_type: ResourceType::Agent,
568 /// };
569 /// resolver.add_or_update_lockfile_entry(&mut lockfile, "my-agent", updated_entry);
570 /// assert_eq!(lockfile.agents.len(), 1); // Still one entry, but updated
571 /// ```
572 fn add_or_update_lockfile_entry(
573 &self,
574 lockfile: &mut LockFile,
575 _name: &str,
576 entry: LockedResource,
577 ) {
578 // Get the appropriate resource collection based on the entry's type
579 let resources = lockfile.get_resources_mut(entry.resource_type);
580
581 // Use (name, source, tool) matching for deduplication
582 // This allows multiple entries with the same name from different sources or tools,
583 // which will be caught by conflict detection if they map to the same path
584 if let Some(existing) = resources
585 .iter_mut()
586 .find(|e| e.name == entry.name && e.source == entry.source && e.tool == entry.tool)
587 {
588 *existing = entry;
589 } else {
590 resources.push(entry);
591 }
592 }
593
594 /// Removes stale lockfile entries that are no longer in the manifest.
595 ///
596 /// This method removes lockfile entries for direct manifest dependencies that have been
597 /// commented out or removed from the manifest. This must be called BEFORE
598 /// `remove_manifest_entries_for_update()` to ensure stale entries don't cause conflicts
599 /// during resolution.
600 ///
    /// Staleness is determined per entry:
    /// - `manifest_alias` is `None`: a direct dependency; stale if its name is no longer a manifest key
    /// - `manifest_alias` is `Some(alias)`: a pattern-expanded dependency; stale if the alias is no longer a manifest key
604 ///
605 /// For each stale entry found, this also removes its transitive children to maintain
606 /// lockfile consistency.
607 ///
608 /// # Arguments
609 ///
610 /// * `lockfile` - The mutable lockfile to clean
611 ///
612 /// # Examples
613 ///
614 /// If a user comments out an agent in agpm.toml:
615 /// ```toml
616 /// # [agents]
617 /// # example = { source = "community", path = "agents/example.md", version = "v1.0.0" }
618 /// ```
619 ///
620 /// This function will remove the "example" agent from the lockfile and all its transitive
621 /// dependencies before the update process begins.
622 fn remove_stale_manifest_entries(&self, lockfile: &mut LockFile) {
623 use std::collections::HashSet;
624
625 // Collect all current manifest keys for each resource type
626 let manifest_agents: HashSet<String> =
627 self.manifest.agents.keys().map(|k| k.to_string()).collect();
628 let manifest_snippets: HashSet<String> =
629 self.manifest.snippets.keys().map(|k| k.to_string()).collect();
630 let manifest_commands: HashSet<String> =
631 self.manifest.commands.keys().map(|k| k.to_string()).collect();
632 let manifest_scripts: HashSet<String> =
633 self.manifest.scripts.keys().map(|k| k.to_string()).collect();
634 let manifest_hooks: HashSet<String> =
635 self.manifest.hooks.keys().map(|k| k.to_string()).collect();
636 let manifest_mcp_servers: HashSet<String> =
637 self.manifest.mcp_servers.keys().map(|k| k.to_string()).collect();
638
639 // Helper to get the right manifest keys for a resource type
640 let get_manifest_keys = |resource_type: crate::core::ResourceType| match resource_type {
641 crate::core::ResourceType::Agent => &manifest_agents,
642 crate::core::ResourceType::Snippet => &manifest_snippets,
643 crate::core::ResourceType::Command => &manifest_commands,
644 crate::core::ResourceType::Script => &manifest_scripts,
645 crate::core::ResourceType::Hook => &manifest_hooks,
646 crate::core::ResourceType::McpServer => &manifest_mcp_servers,
647 };
648
649 // Collect (name, source) pairs to remove
650 let mut entries_to_remove: HashSet<(String, Option<String>)> = HashSet::new();
651 let mut direct_entries: Vec<(String, Option<String>)> = Vec::new();
652
653 // Find all manifest-level entries that are no longer in the manifest
654 for resource_type in crate::core::ResourceType::all() {
655 let manifest_keys = get_manifest_keys(*resource_type);
656 let resources = lockfile.get_resources(*resource_type);
657
658 for entry in resources {
659 // Determine if this is a stale manifest-level entry (no longer in manifest)
660 let is_stale = if let Some(ref alias) = entry.manifest_alias {
661 // Pattern-expanded entry: stale if alias is NOT in manifest
662 !manifest_keys.contains(alias)
663 } else {
664 // Direct entry: stale if name is NOT in manifest
665 !manifest_keys.contains(&entry.name)
666 };
667
668 if is_stale {
669 let key = (entry.name.clone(), entry.source.clone());
670 entries_to_remove.insert(key.clone());
671 direct_entries.push(key);
672 }
673 }
674 }
675
676 // For each stale entry, recursively collect its transitive children
677 for (parent_name, parent_source) in direct_entries {
678 for resource_type in crate::core::ResourceType::all() {
679 if let Some(parent_entry) = lockfile
680 .get_resources(*resource_type)
681 .iter()
682 .find(|e| e.name == parent_name && e.source == parent_source)
683 {
684 Self::collect_transitive_children(
685 lockfile,
686 parent_entry,
687 &mut entries_to_remove,
688 );
689 }
690 }
691 }
692
693 // Remove all marked entries
694 let should_remove = |entry: &crate::lockfile::LockedResource| {
695 entries_to_remove.contains(&(entry.name.clone(), entry.source.clone()))
696 };
697
698 lockfile.agents.retain(|entry| !should_remove(entry));
699 lockfile.snippets.retain(|entry| !should_remove(entry));
700 lockfile.commands.retain(|entry| !should_remove(entry));
701 lockfile.scripts.retain(|entry| !should_remove(entry));
702 lockfile.hooks.retain(|entry| !should_remove(entry));
703 lockfile.mcp_servers.retain(|entry| !should_remove(entry));
704 }
705
706 /// Removes lockfile entries for manifest dependencies that will be re-resolved.
707 ///
708 /// This method removes old entries for direct manifest dependencies before updating,
709 /// which handles the case where a dependency's source or resource type changes.
710 /// This prevents duplicate entries with the same name but different sources.
711 ///
712 /// Pattern-expanded and transitive dependencies are preserved because:
713 /// - Pattern expansions will be re-added during resolution with (name, source) matching
714 /// - Transitive dependencies aren't manifest keys and won't be removed
715 ///
716 /// # Arguments
717 ///
718 /// * `lockfile` - The mutable lockfile to clean
719 /// * `manifest_keys` - Set of manifest dependency keys being updated
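    ///
    /// # Example
    ///
    /// A sketch of how the update path might invoke this (the dependency name is illustrative):
    ///
    /// ```ignore
    /// let manifest_keys: HashSet<String> = ["my-agent".to_string()].into_iter().collect();
    /// resolver.remove_manifest_entries_for_update(&mut lockfile, &manifest_keys);
    /// ```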
720 fn remove_manifest_entries_for_update(
721 &self,
722 lockfile: &mut LockFile,
723 manifest_keys: &HashSet<String>,
724 ) {
725 use std::collections::HashSet;
726
727 // Collect (name, source) pairs to remove
728 // We use (name, source) tuples to distinguish same-named resources from different sources
729 let mut entries_to_remove: HashSet<(String, Option<String>)> = HashSet::new();
730
731 // Step 1: Find direct manifest entries and collect them for transitive traversal
732 let mut direct_entries: Vec<(String, Option<String>)> = Vec::new();
733
734 for resource_type in crate::core::ResourceType::all() {
735 let resources = lockfile.get_resources(*resource_type);
736 for entry in resources {
737 // Check if this entry originates from a manifest key being updated
738 if manifest_keys.contains(&entry.name)
739 || entry
740 .manifest_alias
741 .as_ref()
742 .is_some_and(|alias| manifest_keys.contains(alias))
743 {
744 let key = (entry.name.clone(), entry.source.clone());
745 entries_to_remove.insert(key.clone());
746 direct_entries.push(key);
747 }
748 }
749 }
750
751 // Step 2: For each direct entry, recursively collect its transitive children
752 // This ensures that when "agent-A" changes from repo1 to repo2, we also remove
753 // all transitive dependencies that came from repo1 via agent-A
754 for (parent_name, parent_source) in direct_entries {
755 // Find the parent entry in the lockfile
756 for resource_type in crate::core::ResourceType::all() {
757 if let Some(parent_entry) = lockfile
758 .get_resources(*resource_type)
759 .iter()
760 .find(|e| e.name == parent_name && e.source == parent_source)
761 {
762 // Walk its dependency tree
763 Self::collect_transitive_children(
764 lockfile,
765 parent_entry,
766 &mut entries_to_remove,
767 );
768 }
769 }
770 }
771
772 // Step 3: Remove all marked entries
773 let should_remove = |entry: &crate::lockfile::LockedResource| {
774 entries_to_remove.contains(&(entry.name.clone(), entry.source.clone()))
775 };
776
777 lockfile.agents.retain(|entry| !should_remove(entry));
778 lockfile.snippets.retain(|entry| !should_remove(entry));
779 lockfile.commands.retain(|entry| !should_remove(entry));
780 lockfile.scripts.retain(|entry| !should_remove(entry));
781 lockfile.hooks.retain(|entry| !should_remove(entry));
782 lockfile.mcp_servers.retain(|entry| !should_remove(entry));
783 }
784
785 /// Recursively collect all transitive children of a lockfile entry.
786 ///
787 /// This walks the dependency graph starting from `parent`, following the `dependencies`
788 /// field to find all resources that transitively depend on the parent. Only dependencies
789 /// with the same source as the parent are collected (to avoid removing unrelated resources).
790 ///
791 /// The `dependencies` field contains strings in the format:
792 /// - `"resource_type/name"` for dependencies from the same source
793 /// - `"source:resource_type/name:version"` for explicit source references
794 ///
795 /// # Arguments
796 ///
797 /// * `lockfile` - The lockfile to search for dependencies
798 /// * `parent` - The parent entry whose children we want to collect
799 /// * `entries_to_remove` - Set of (name, source) pairs to populate with found children
800 fn collect_transitive_children(
801 lockfile: &crate::lockfile::LockFile,
802 parent: &crate::lockfile::LockedResource,
803 entries_to_remove: &mut std::collections::HashSet<(String, Option<String>)>,
804 ) {
805 // For each dependency declared by this parent
806 for dep_ref in &parent.dependencies {
807 // Parse dependency reference: "source:resource_type/name:version" or "resource_type/name"
808 // Examples: "repo1:snippet/utils:v1.0.0" or "agent/helper"
809 let (dep_source, dep_name) = if let Some(colon_pos) = dep_ref.find(':') {
810 // Format: "source:resource_type/name:version"
811 let source_part = &dep_ref[..colon_pos];
812 let rest = &dep_ref[colon_pos + 1..];
813 // Find the resource_type/name part (before optional :version)
814 let type_name_part = if let Some(ver_colon) = rest.rfind(':') {
815 &rest[..ver_colon]
816 } else {
817 rest
818 };
819 // Extract name from "resource_type/name"
820 if let Some(slash_pos) = type_name_part.find('/') {
821 let name = &type_name_part[slash_pos + 1..];
822 (Some(source_part.to_string()), name.to_string())
823 } else {
824 continue; // Invalid format, skip
825 }
826 } else {
827 // Format: "resource_type/name"
828 if let Some(slash_pos) = dep_ref.find('/') {
829 let name = &dep_ref[slash_pos + 1..];
830 // Inherit parent's source
831 (parent.source.clone(), name.to_string())
832 } else {
833 continue; // Invalid format, skip
834 }
835 };
836
837 // Find the dependency entry with matching name and source
838 for resource_type in crate::core::ResourceType::all() {
839 if let Some(dep_entry) = lockfile
840 .get_resources(*resource_type)
841 .iter()
842 .find(|e| e.name == dep_name && e.source == dep_source)
843 {
844 let key = (dep_entry.name.clone(), dep_entry.source.clone());
845
846 // Add to removal set and recurse (if not already processed)
847 if !entries_to_remove.contains(&key) {
848 entries_to_remove.insert(key);
849 // Recursively collect this dependency's children
850 Self::collect_transitive_children(lockfile, dep_entry, entries_to_remove);
851 }
852 }
853 }
854 }
855 }
856
857 /// Detects conflicts where multiple dependencies resolve to the same installation path.
858 ///
859 /// This method validates that no two dependencies will overwrite each other during
860 /// installation. It builds a map of all resolved `installed_at` paths and checks for
861 /// collisions across all resource types.
862 ///
863 /// # Arguments
864 ///
865 /// * `lockfile` - The lockfile containing all resolved dependencies
866 ///
867 /// # Returns
868 ///
869 /// Returns `Ok(())` if no conflicts are detected, or an error describing the conflicts.
870 ///
871 /// # Errors
872 ///
873 /// Returns an error if:
874 /// - Two or more dependencies resolve to the same `installed_at` path
875 /// - The error message lists all conflicting dependency names and the shared path
876 ///
877 /// # Examples
878 ///
879 /// ```rust,ignore
880 /// // This would cause a conflict error:
881 /// // [agents]
882 /// // v1 = { source = "repo", path = "agents/example.md", version = "v1.0" }
883 /// // v2 = { source = "repo", path = "agents/example.md", version = "v2.0" }
884 /// // Both resolve to .claude/agents/example.md
885 /// ```
886 fn detect_target_conflicts(&self, lockfile: &LockFile) -> Result<()> {
887 use std::collections::HashMap;
888
889 // Map of (installed_at path, resolved_commit) -> list of dependency names
890 // Two dependencies with the same path AND same commit are NOT a conflict
891 let mut path_map: HashMap<(String, Option<String>), Vec<String>> = HashMap::new();
892
893 // Collect all resources from lockfile
894 // Note: Hooks and MCP servers are excluded because they're configuration-only
895 // resources that are designed to share config files (.claude/settings.local.json
896 // for hooks, .mcp.json for MCP servers), not individual files that would conflict.
897 let all_resources: Vec<(&str, &LockedResource)> = lockfile
898 .agents
899 .iter()
900 .map(|r| (r.name.as_str(), r))
901 .chain(lockfile.snippets.iter().map(|r| (r.name.as_str(), r)))
902 .chain(lockfile.commands.iter().map(|r| (r.name.as_str(), r)))
903 .chain(lockfile.scripts.iter().map(|r| (r.name.as_str(), r)))
904 // Hooks and MCP servers intentionally omitted - they share config files
905 .collect();
906
907 // Build the path map with commit information
908 for (name, resource) in &all_resources {
909 let key = (resource.installed_at.clone(), resource.resolved_commit.clone());
910 path_map.entry(key).or_default().push((*name).to_string());
911 }
912
913 // Now check for actual conflicts: same path but DIFFERENT commits
914 // Group by path only to find potential conflicts
915 let mut path_only_map: HashMap<String, Vec<(&str, &LockedResource)>> = HashMap::new();
916 for (name, resource) in &all_resources {
917 path_only_map.entry(resource.installed_at.clone()).or_default().push((name, resource));
918 }
919
920 // Find conflicts (same path with different commits)
921 let mut conflicts: Vec<(String, Vec<String>)> = Vec::new();
922 for (path, resources) in path_only_map {
923 if resources.len() > 1 {
924 // Check if they have different commits
925 let commits: std::collections::HashSet<_> =
926 resources.iter().map(|(_, r)| &r.resolved_commit).collect();
927
928 // Only a conflict if different commits
929 if commits.len() > 1 {
930 let names: Vec<String> =
931 resources.iter().map(|(n, _)| (*n).to_string()).collect();
932 conflicts.push((path, names));
933 }
934 }
935 }
936
937 if !conflicts.is_empty() {
938 // Build a detailed error message
939 let mut error_msg = String::from(
940 "Target path conflicts detected:\n\n\
941 Multiple dependencies resolve to the same installation path with different content.\n\
942 This would cause files to overwrite each other.\n\n",
943 );
944
945 for (path, names) in &conflicts {
946 error_msg.push_str(&format!(
947 " Path: {}\n Conflicts: {}\n\n",
948 path,
949 names.join(", ")
950 ));
951 }
952
953 error_msg.push_str(
954 "To resolve this conflict:\n\
955 1. Use custom 'target' field to specify different installation paths:\n\
956 Example: target = \"custom/subdir/file.md\"\n\n\
957 2. Use custom 'filename' field to specify different filenames:\n\
958 Example: filename = \"utils-v2.md\"\n\n\
959 3. For transitive dependencies, add them as direct dependencies with custom target/filename\n\n\
960 4. Ensure pattern dependencies don't overlap with single-file dependencies\n\n\
961 Note: This often occurs when different dependencies have transitive dependencies\n\
962 with the same name but from different sources.",
963 );
964
965 return Err(anyhow::anyhow!(error_msg));
966 }
967
968 Ok(())
969 }
970
971 /// Pre-syncs all sources needed for the given dependencies.
972 ///
973 /// This method implements the first phase of the two-phase resolution architecture.
974 /// It should be called during the "Syncing sources" phase to perform all Git
975 /// clone/fetch operations upfront, before actual dependency resolution.
976 ///
977 /// This separation provides several benefits:
978 /// - Clear separation of network operations from version resolution logic
979 /// - Better progress reporting with distinct phases
980 /// - Enables batch processing of Git operations for efficiency
981 /// - Allows the `resolve_all()` method to work purely with local cached data
982 ///
983 /// After calling this method, the internal [`VersionResolver`] will have all
984 /// necessary source repositories cached and ready for version-to-SHA resolution.
985 ///
986 /// # Arguments
987 ///
988 /// * `deps` - A slice of tuples containing dependency names and their definitions.
989 /// Only dependencies with Git sources will be processed.
990 ///
991 /// # Example
992 ///
993 /// ```no_run
994 /// # use agpm_cli::resolver::DependencyResolver;
995 /// # use agpm_cli::manifest::{Manifest, ResourceDependency};
996 /// # use agpm_cli::cache::Cache;
997 /// # async fn example() -> anyhow::Result<()> {
998 /// let manifest = Manifest::new();
999 /// let cache = Cache::new()?;
1000 /// let mut resolver = DependencyResolver::with_cache(manifest.clone(), cache);
1001 ///
1002 /// // Get all dependencies from manifest
1003 /// let deps: Vec<(String, ResourceDependency)> = manifest
1004 /// .all_dependencies()
1005 /// .into_iter()
1006 /// .map(|(name, dep)| (name.to_string(), dep.clone()))
1007 /// .collect();
1008 ///
1009 /// // Phase 1: Pre-sync all sources (performs Git clone/fetch operations)
1010 /// resolver.pre_sync_sources(&deps).await?;
1011 ///
1012 /// // Phase 2: Now sources are ready for version resolution (no network I/O)
1013 /// let resolved = resolver.resolve().await?;
1014 /// # Ok(())
1015 /// # }
1016 /// ```
1017 ///
1018 /// # Two-Phase Resolution Pattern
1019 ///
1020 /// This method is part of AGPM's two-phase resolution architecture:
1021 ///
1022 /// 1. **Sync Phase** (`pre_sync_sources`): Clone/fetch all Git repositories
1023 /// 2. **Resolution Phase** (`resolve` or `update`): Resolve versions to SHAs locally
1024 ///
1025 /// This pattern ensures all network operations happen upfront with clear progress
1026 /// reporting, while version resolution can proceed quickly using cached data.
1027 ///
1028 /// # Errors
1029 ///
1030 /// Returns an error if:
1031 /// - Source repository cloning or fetching fails
1032 /// - Network connectivity issues occur
1033 /// - Authentication fails for private repositories
1034 /// - Source names in dependencies don't match configured sources
1035 /// - Git operations fail due to repository corruption or disk space issues
1036 pub async fn pre_sync_sources(&mut self, deps: &[(String, ResourceDependency)]) -> Result<()> {
1037 // Clear and rebuild the version resolver entries
1038 self.version_resolver.clear();
1039
1040 // Collect all unique (source, version) pairs
1041 for (_, dep) in deps {
1042 if let Some(source_name) = dep.get_source() {
1043 let source_url =
1044 self.source_manager.get_source_url(source_name).ok_or_else(|| {
1045 AgpmError::SourceNotFound {
1046 name: source_name.to_string(),
1047 }
1048 })?;
1049
1050 let version = dep.get_version();
1051
1052 // Add to version resolver for batch syncing
1053 self.version_resolver.add_version(source_name, &source_url, version);
1054 }
1055 }
1056
1057 // Pre-sync all sources (performs Git operations)
1058 self.version_resolver.pre_sync_sources().await.context("Failed to sync sources")?;
1059
1060 Ok(())
1061 }
1062
1063 /// Get available versions (tags) for a repository.
1064 ///
1065 /// Lists all tags from a Git repository, which typically represent available versions.
1066 /// This is useful for checking what versions are available for updates.
1067 ///
1068 /// # Arguments
1069 ///
1070 /// * `repo_path` - Path to the Git repository (typically in the cache directory)
1071 ///
1072 /// # Returns
1073 ///
1074 /// A vector of version strings (tag names) available in the repository.
1075 ///
1076 /// # Examples
1077 ///
1078 /// ```rust,no_run,ignore
1079 /// use agpm_cli::resolver::DependencyResolver;
1080 /// use agpm_cli::cache::Cache;
1081 ///
1082 /// # async fn example() -> anyhow::Result<()> {
1083 /// let cache = Cache::new()?;
1084 /// let repo_path = cache.get_repo_path("community");
    /// # let manifest = agpm_cli::manifest::Manifest::new();
    /// let resolver = DependencyResolver::new(manifest, cache)?;
1086 ///
1087 /// let versions = resolver.get_available_versions(&repo_path).await?;
1088 /// for version in versions {
1089 /// println!("Available: {}", version);
1090 /// }
1091 /// # Ok(())
1092 /// # }
1093 /// ```
1094 pub async fn get_available_versions(&self, repo_path: &Path) -> Result<Vec<String>> {
1095 let repo = GitRepo::new(repo_path);
1096 repo.list_tags()
1097 .await
1098 .with_context(|| format!("Failed to list tags from repository at {repo_path:?}"))
1099 }
1100
1101 /// Creates worktrees for all resolved SHAs in parallel.
1102 ///
1103 /// This helper method is part of AGPM's SHA-based worktree architecture, processing
1104 /// all resolved versions from the [`VersionResolver`] and creating Git worktrees
1105 /// for each unique commit SHA. It leverages async concurrency to create multiple
1106 /// worktrees in parallel while maintaining proper error propagation.
1107 ///
1108 /// The method implements SHA-based deduplication - multiple versions (tags, branches)
1109 /// that resolve to the same commit SHA will share a single worktree, maximizing
1110 /// disk space efficiency and reducing clone operations.
1111 ///
1112 /// # Implementation Details
1113 ///
1114 /// 1. **Parallel Execution**: Uses `futures::future::join_all()` for concurrent worktree creation
1115 /// 2. **SHA-based Keys**: Worktrees are keyed by commit SHA rather than version strings
1116 /// 3. **Deduplication**: Multiple refs pointing to the same commit share one worktree
1117 /// 4. **Error Handling**: Fails fast if any worktree creation fails
1118 ///
1119 /// # Returns
1120 ///
1121 /// Returns a [`HashMap`] mapping repository keys (format: `"source::version"`) to
1122 /// [`PreparedSourceVersion`] structs containing:
1123 /// - `worktree_path`: Absolute path to the created worktree directory
1124 /// - `resolved_version`: The resolved Git reference (tag, branch, or SHA)
1125 /// - `resolved_commit`: The final commit SHA for the worktree
1126 ///
1127 /// # Example Usage
1128 ///
1129 /// ```ignore
1130 /// // This is called internally after version resolution
1131 /// let prepared = resolver.create_worktrees_for_resolved_versions().await?;
1132 ///
1133 /// // Access worktree for a specific dependency
1134 /// let key = DependencyResolver::group_key("my-source", "v1.0.0");
1135 /// if let Some(prepared_version) = prepared.get(&key) {
1136 /// println!("Worktree at: {}", prepared_version.worktree_path.display());
1137 /// println!("Commit SHA: {}", prepared_version.resolved_commit);
1138 /// }
1139 /// ```
1140 ///
1141 /// # Errors
1142 ///
1143 /// Returns an error if:
1144 /// - Source URL cannot be found for a resolved version (indicates configuration issue)
1145 /// - Worktree creation fails for any SHA (disk space, permissions, Git errors)
1146 /// - File system operations fail (I/O errors, permission denied)
1147 /// - Git operations fail (corrupted repository, invalid SHA)
1148 async fn create_worktrees_for_resolved_versions(
1149 &self,
1150 ) -> Result<HashMap<String, PreparedSourceVersion>> {
1151 let resolved_full = self.version_resolver.get_all_resolved_full().clone();
1152 let mut prepared_versions = HashMap::new();
1153
1154 // Build futures for parallel worktree creation
1155 let mut futures = Vec::new();
1156
1157 for ((source_name, version_key), resolved_version) in resolved_full {
1158 let sha = resolved_version.sha;
1159 let resolved_ref = resolved_version.resolved_ref;
1160 let repo_key = Self::group_key(&source_name, &version_key);
1161 let cache_clone = self.cache.clone();
1162 let source_name_clone = source_name.clone();
1163
1164 // Get the source URL for this source
1165 let source_url_clone = self
1166 .source_manager
1167 .get_source_url(&source_name)
1168 .ok_or_else(|| AgpmError::SourceNotFound {
1169 name: source_name.to_string(),
1170 })?
1171 .to_string();
1172
1173 let sha_clone = sha.clone();
1174 let resolved_ref_clone = resolved_ref.clone();
1175
1176 let future = async move {
1177 // Use SHA-based worktree creation
1178 // The version resolver has already handled fetching and SHA resolution
1179 let worktree_path = cache_clone
1180 .get_or_create_worktree_for_sha(
1181 &source_name_clone,
1182 &source_url_clone,
1183 &sha_clone,
1184 Some(&source_name_clone), // context for logging
1185 )
1186 .await?;
1187
1188 Ok::<_, anyhow::Error>((
1189 repo_key,
1190 PreparedSourceVersion {
1191 worktree_path,
1192 resolved_version: Some(resolved_ref_clone),
1193 resolved_commit: sha_clone,
1194 },
1195 ))
1196 };
1197
1198 futures.push(future);
1199 }
1200
1201 // Execute all futures concurrently and collect results
1202 let results = futures::future::join_all(futures).await;
1203
1204 // Process results and build the map
1205 for result in results {
1206 let (key, prepared) = result?;
1207 prepared_versions.insert(key, prepared);
1208 }
1209
1210 Ok(prepared_versions)
1211 }
1212
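
    /// Prepares worktrees for all remote `(source, version)` groups referenced by `deps`.
    ///
    /// If [`pre_sync_sources`](Self::pre_sync_sources) was not called beforehand, the
    /// version resolver entries are rebuilt and sources are synced first (used by tests
    /// and for backward compatibility). All collected versions are then resolved to
    /// commit SHAs, worktrees are created for each unique SHA, and local directory
    /// sources are registered directly without worktrees.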
1213 async fn prepare_remote_groups(&mut self, deps: &[(String, ResourceDependency)]) -> Result<()> {
1214 self.prepared_versions.clear();
1215
        // Step 1: Check if we need to rebuild version resolver entries
1217 // This happens when prepare_remote_groups is called without pre_sync_sources
1218 // (e.g., during tests or backward compatibility)
1219 if !self.version_resolver.has_entries() {
1220 // Rebuild entries for version resolution
1221 for (_, dep) in deps {
1222 if let Some(source_name) = dep.get_source() {
1223 let source_url =
1224 self.source_manager.get_source_url(source_name).ok_or_else(|| {
1225 AgpmError::SourceNotFound {
1226 name: source_name.to_string(),
1227 }
1228 })?;
1229
1230 let version = dep.get_version();
1231
1232 // Add to version resolver for batch resolution
1233 self.version_resolver.add_version(source_name, &source_url, version);
1234 }
1235 }
1236
1237 // If entries were rebuilt, we need to sync sources first
1238 self.version_resolver.pre_sync_sources().await.context("Failed to sync sources")?;
1239 }
1240
        // Step 2: Resolve all versions to SHAs
1242 self.version_resolver.resolve_all().await.context("Failed to resolve versions to SHAs")?;
1243
1244 // Step 3: Create worktrees for all resolved SHAs in parallel
1245 let prepared_versions = self.create_worktrees_for_resolved_versions().await?;
1246
1247 // Store the prepared versions
1248 self.prepared_versions.extend(prepared_versions);
1249
1250 // Step 4: Handle local sources separately (they don't need worktrees)
1251 for (_, dep) in deps {
1252 if let Some(source_name) = dep.get_source() {
1253 let source_url =
1254 self.source_manager.get_source_url(source_name).ok_or_else(|| {
1255 AgpmError::SourceNotFound {
1256 name: source_name.to_string(),
1257 }
1258 })?;
1259
1260 // Check if this is a local directory source
1261 if crate::utils::is_local_path(&source_url) {
1262 let version_key = dep.get_version().unwrap_or("HEAD");
1263 let group_key = Self::group_key(source_name, version_key);
1264
1265 // Add to prepared_versions with the local path
1266 self.prepared_versions.insert(
1267 group_key,
1268 PreparedSourceVersion {
1269 worktree_path: PathBuf::from(&source_url),
1270 resolved_version: Some("local".to_string()),
1271 resolved_commit: String::new(), // No commit for local sources
1272 },
1273 );
1274 }
1275 }
1276 }
1277
1278 // Phase completion is handled by the caller
1279
1280 Ok(())
1281 }
1282
1283 /// Creates a new resolver using only manifest-defined sources.
1284 ///
1285 /// This constructor creates a resolver that only considers sources defined
1286 /// in the manifest file. Global configuration sources from `~/.agpm/config.toml`
1287 /// are ignored, which may cause resolution failures for private repositories
1288 /// that require authentication.
1289 ///
1290 /// # Usage
1291 ///
1292 /// Use this constructor for:
1293 /// - Public repositories only
1294 /// - Testing and development
1295 /// - Backward compatibility with older workflows
1296 ///
1297 /// For production use with private repositories, prefer [`new_with_global()`].
1298 ///
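    /// # Example
    ///
    /// A minimal construction sketch (the manifest path is illustrative):
    ///
    /// ```rust,no_run
    /// use agpm_cli::resolver::DependencyResolver;
    /// use agpm_cli::manifest::Manifest;
    /// use agpm_cli::cache::Cache;
    /// use std::path::Path;
    ///
    /// # fn example() -> anyhow::Result<()> {
    /// let manifest = Manifest::load(Path::new("agpm.toml"))?;
    /// let cache = Cache::new()?;
    /// let resolver = DependencyResolver::new(manifest, cache)?;
    /// # Ok(())
    /// # }
    /// ```
    ///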
1299 /// # Errors
1300 ///
1301 /// Returns an error if the cache cannot be created.
1302 ///
1303 /// [`new_with_global()`]: DependencyResolver::new_with_global
1304 pub fn new(manifest: Manifest, cache: Cache) -> Result<Self> {
1305 let source_manager = SourceManager::from_manifest(&manifest)?;
1306 let version_resolver = VersionResolver::new(cache.clone());
1307
1308 Ok(Self {
1309 manifest,
1310 source_manager,
1311 cache,
1312 prepared_versions: HashMap::new(),
1313 version_resolver,
1314 dependency_map: HashMap::new(),
1315 conflict_detector: ConflictDetector::new(),
1316 pattern_alias_map: HashMap::new(),
1317 })
1318 }
1319
1320 /// Creates a new resolver with global configuration support.
1321 ///
1322 /// This is the recommended constructor for most use cases. It loads both
1323 /// manifest sources and global sources from `~/.agpm/config.toml`, enabling
1324 /// access to private repositories with authentication tokens.
1325 ///
1326 /// # Source Priority
1327 ///
1328 /// When sources are defined in both locations:
1329 /// 1. **Global sources** (from `~/.agpm/config.toml`) are loaded first
1330 /// 2. **Local sources** (from `agpm.toml`) can override global sources
1331 ///
1332 /// This allows teams to share project configurations while keeping
1333 /// authentication tokens in user-specific global config.
1334 ///
1335 /// # Errors
1336 ///
1337 /// Returns an error if:
1338 /// - The cache cannot be created
1339 /// - The global config file exists but cannot be parsed
1340 /// - Network errors occur while validating global sources
1341 pub async fn new_with_global(manifest: Manifest, cache: Cache) -> Result<Self> {
1342 let source_manager = SourceManager::from_manifest_with_global(&manifest).await?;
1343 let version_resolver = VersionResolver::new(cache.clone());
1344
1345 Ok(Self {
1346 manifest,
1347 source_manager,
1348 cache,
1349 prepared_versions: HashMap::new(),
1350 version_resolver,
1351 dependency_map: HashMap::new(),
1352 conflict_detector: ConflictDetector::new(),
1353 pattern_alias_map: HashMap::new(),
1354 })
1355 }
1356
1357 /// Creates a new resolver with a custom cache.
1358 ///
1359 /// This constructor is primarily used for testing and specialized deployments
1360 /// where the default cache location (`~/.agpm/cache/`) is not suitable.
1361 ///
1362 /// # Use Cases
1363 ///
1364 /// - **Testing**: Isolated cache for test environments
1365 /// - **CI/CD**: Custom cache locations for build systems
1366 /// - **Containers**: Non-standard filesystem layouts
1367 /// - **Multi-user**: Shared cache directories
1368 ///
1369 /// # Note
1370 ///
1371 /// This constructor does not load global configuration. If you need both
1372 /// custom cache location and global config support, create the resolver
1373 /// with [`new_with_global()`] and manually configure the source manager.
1374 ///
1375 /// [`new_with_global()`]: DependencyResolver::new_with_global
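///
/// # Example
///
/// A test-oriented sketch. `Cache::new()` is used here only to keep the example
/// self-contained; in practice you would construct the cache rooted at your
/// custom location before handing it to the resolver:
///
/// ```rust,no_run
/// # use agpm_cli::resolver::DependencyResolver;
/// # use agpm_cli::manifest::Manifest;
/// # use agpm_cli::cache::Cache;
/// # fn example() -> anyhow::Result<()> {
/// let manifest = Manifest::load("agpm.toml".as_ref())?;
/// let cache = Cache::new()?; // substitute a cache pointing at the custom directory
/// let resolver = DependencyResolver::with_cache(manifest, cache);
/// # let _ = resolver;
/// # Ok(())
/// # }
/// ```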
1376 #[must_use]
1377 pub fn with_cache(manifest: Manifest, cache: Cache) -> Self {
1378 let cache_dir = cache.get_cache_location().to_path_buf();
1379 let source_manager = SourceManager::from_manifest_with_cache(&manifest, cache_dir);
1380 let version_resolver = VersionResolver::new(cache.clone());
1381
1382 Self {
1383 manifest,
1384 source_manager,
1385 cache,
1386 prepared_versions: HashMap::new(),
1387 version_resolver,
1388 dependency_map: HashMap::new(),
1389 conflict_detector: ConflictDetector::new(),
1390 pattern_alias_map: HashMap::new(),
1391 }
1392 }
1393
1442 /// Resolve transitive dependencies by extracting metadata from resource files.
1443 ///
1444 /// This method builds a dependency graph by:
1445 /// 1. Starting with direct manifest dependencies
1446 /// 2. Extracting metadata from each resolved resource
1447 /// 3. Adding discovered transitive dependencies to the graph
1448 /// 4. Checking for circular dependencies
1449 /// 5. Returning all dependencies in topological order
1450 ///
1451 /// # Cross-Source Handling
1452 ///
1453 /// Resources are uniquely identified by `(ResourceType, name, source)`, allowing multiple
1454 /// resources with the same name from different sources to coexist. During topological
1455 /// ordering, all resources with matching name and type are included, even if they come
1456 /// from different sources.
1457 ///
1458 /// # Arguments
1459 /// * `base_deps` - The initial dependencies from the manifest
1460 /// * `enable_transitive` - Whether to resolve transitive dependencies
1461 ///
1462 /// # Returns
1463 /// A vector of all dependencies (direct + transitive) in topological order, including
1464 /// all same-named resources from different sources. Each tuple contains:
1465 /// - Dependency name
1466 /// - Resource dependency details
1467 /// - Resource type (to avoid ambiguity when same name exists across types)
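///
/// # Example
///
/// A sketch of the internal call shape (marked `ignore` because the method is
/// private); `base_deps` stands for the typed dependency list collected from
/// the manifest:
///
/// ```rust,ignore
/// let ordered = resolver
///     .resolve_transitive_dependencies(&base_deps, /* enable_transitive */ true)
///     .await?;
/// for (name, dep, resource_type) in &ordered {
///     println!("{resource_type}: {name} -> {}", dep.get_path());
/// }
/// ```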
1468 async fn resolve_transitive_dependencies(
1469 &mut self,
1470 base_deps: &[(String, ResourceDependency, crate::core::ResourceType)],
1471 enable_transitive: bool,
1472 ) -> Result<Vec<(String, ResourceDependency, crate::core::ResourceType)>> {
1473 // Clear state from any previous resolution to prevent stale data
1474 // IMPORTANT: Must clear before early return to avoid contaminating non-transitive runs
1475 self.dependency_map.clear();
1476 // NOTE: Don't reset conflict_detector here - it was already populated with direct dependencies
1477
1478 if !enable_transitive {
1479 // If transitive resolution is disabled, return base dependencies as-is
1480 // with their resource types already threaded from the manifest
1481 return Ok(base_deps.to_vec());
1482 }
1483
1484 let mut graph = DependencyGraph::new();
1485 // Use (resource_type, name, source, tool) as key to distinguish same-named resources from different sources and tools
1486 let mut all_deps: HashMap<
1487 (crate::core::ResourceType, String, Option<String>, Option<String>),
1488 ResourceDependency,
1489 > = HashMap::new();
1490 let mut processed: HashSet<(
1491 crate::core::ResourceType,
1492 String,
1493 Option<String>,
1494 Option<String>,
1495 )> = HashSet::new();
1496 let mut queue: Vec<(String, ResourceDependency, Option<crate::core::ResourceType>)> =
1497 Vec::new();
1498
1499 // Add initial dependencies to queue with their threaded types
1500 for (name, dep, resource_type) in base_deps {
1501 let source = dep.get_source().map(std::string::ToString::to_string);
1502 let tool = dep.get_tool().map(std::string::ToString::to_string);
1503 queue.push((name.clone(), dep.clone(), Some(*resource_type)));
1504 all_deps.insert((*resource_type, name.clone(), source, tool), dep.clone());
1505 }
1506
1507 // Process queue to discover transitive dependencies
1508 while let Some((name, dep, resource_type)) = queue.pop() {
1509 let source = dep.get_source().map(std::string::ToString::to_string);
1510 let tool = dep.get_tool().map(std::string::ToString::to_string);
1511 let resource_type =
1512 resource_type.expect("resource_type should always be threaded through queue");
1513 let key = (resource_type, name.clone(), source.clone(), tool.clone());
1514
1515 // Check if this queue entry is stale (superseded by conflict resolution)
1516 // IMPORTANT: This must come BEFORE the processed check so that conflict-resolved
1517 // entries can be reprocessed even if an older version was already processed.
1518 // If all_deps has a different version than what we popped, skip this entry
1519 if let Some(current_dep) = all_deps.get(&key) {
1520 if current_dep.get_version() != dep.get_version() {
1521 // This entry was superseded by conflict resolution, skip it
1522 continue;
1523 }
1524 }
1525
1526 if processed.contains(&key) {
1527 continue;
1528 }
1529
1530 processed.insert(key.clone());
1531
1532 // Handle pattern dependencies by expanding them to concrete files
1533 if dep.is_pattern() {
1534 // Expand the pattern to get all matching files
1535 match self.expand_pattern_to_concrete_deps(&name, &dep, resource_type).await {
1536 Ok(concrete_deps) => {
1537 // Queue each concrete dependency for transitive resolution
1538 for (concrete_name, concrete_dep) in concrete_deps {
1539 // Record the mapping from concrete resource name to pattern alias
1540 // This enables patches defined under the pattern alias to be applied to all matched resources
1541 // Key includes ResourceType to prevent collisions between different resource types
1542 self.pattern_alias_map
1543 .insert((resource_type, concrete_name.clone()), name.clone());
1544
1545 let concrete_source =
1546 concrete_dep.get_source().map(std::string::ToString::to_string);
1547 let concrete_tool =
1548 concrete_dep.get_tool().map(std::string::ToString::to_string);
1549 let concrete_key = (
1550 resource_type,
1551 concrete_name.clone(),
1552 concrete_source,
1553 concrete_tool,
1554 );
1555
1556 // Only add if not already processed or queued
1557 if let std::collections::hash_map::Entry::Vacant(e) =
1558 all_deps.entry(concrete_key)
1559 {
1560 e.insert(concrete_dep.clone());
1561 queue.push((concrete_name, concrete_dep, Some(resource_type)));
1562 }
1563 }
1564 }
1565 Err(e) => {
1566 anyhow::bail!(
1567 "Failed to expand pattern '{}' for transitive dependency extraction: {}",
1568 dep.get_path(),
1569 e
1570 );
1571 }
1572 }
1573 continue; // Skip to next queue item
1574 }
1575
1576 // Get the resource content to extract metadata
1577 let content = self.fetch_resource_content(&name, &dep).await.with_context(|| {
1578 format!("Failed to fetch resource '{name}' for transitive dependency extraction")
1579 })?;
1580
1581 // Extract metadata from the resource
1582 let path = PathBuf::from(dep.get_path());
1583 let metadata =
1584 MetadataExtractor::extract(&path, &content, self.manifest.project.as_ref())?;
1585
1586 // Process transitive dependencies if present
1587 if let Some(deps_map) = metadata.dependencies {
1588 tracing::debug!(
1589 "Processing transitive deps for: {} (has source: {:?})",
1590 name,
1591 dep.get_source()
1592 );
1593
1594 for (dep_resource_type_str, dep_specs) in deps_map {
1595 // Convert plural form from YAML (e.g., "agents") to ResourceType enum
1596 // The ResourceType::FromStr accepts both plural and singular forms
1597 let dep_resource_type: crate::core::ResourceType =
1598 dep_resource_type_str.parse().unwrap_or(crate::core::ResourceType::Snippet);
1599
1600 for dep_spec in dep_specs {
1601 // UNIFIED APPROACH: File-relative path resolution for all transitive dependencies
1602
1603 // Get the canonical path to the parent resource file
1604 let parent_file_path = self
1605 .get_canonical_path_for_dependency(&dep)
1606 .await
1607 .with_context(|| {
1608 format!(
1609 "Failed to get parent path for transitive dependencies of '{}'",
1610 name
1611 )
1612 })?;
1613
1614 // Check if this is a glob pattern
1615 let is_pattern = dep_spec.path.contains('*')
1616 || dep_spec.path.contains('?')
1617 || dep_spec.path.contains('[');
1618
1619 let trans_canonical = if is_pattern {
1620 // For patterns, normalize (resolve .. and .) but don't canonicalize
1621 let parent_dir = parent_file_path.parent()
1622 .ok_or_else(|| anyhow::anyhow!(
1623 "Failed to resolve transitive dependency '{}' for '{}': parent file has no directory",
1624 dep_spec.path, name
1625 ))?;
1626 let resolved = parent_dir.join(&dep_spec.path);
1627 // IMPORTANT: Preserve the root component when normalizing
1628 let mut result = PathBuf::new();
1629 for component in resolved.components() {
1630 match component {
1631 std::path::Component::RootDir => {
1632 result.push(component);
1633 } // Preserve root!
1634 std::path::Component::ParentDir => {
1635 result.pop();
1636 }
1637 std::path::Component::CurDir => {}
1638 _ => {
1639 result.push(component);
1640 }
1641 }
1642 }
1643 result
1644 } else {
1645 // For regular paths, fully resolve and canonicalize
1646 crate::utils::resolve_file_relative_path(
1647 &parent_file_path,
1648 &dep_spec.path,
1649 )
1650 .with_context(|| {
1651 format!(
1652 "Failed to resolve transitive dependency '{}' for '{}'",
1653 dep_spec.path, name
1654 )
1655 })?
1656 };
1657
1658 // Create the transitive dependency based on whether parent is Git or path-only
1659 use crate::manifest::DetailedDependency;
1660 let trans_dep = if dep.get_source().is_none() {
1661 // Path-only transitive dep (parent is path-only)
1662 // The path was resolved relative to the parent file (file-relative resolution)
1663 // CRITICAL: Always store as manifest-relative path (even with ../) for lockfile portability
1664 // Absolute paths in lockfiles break cross-machine sharing
1665
1666 let manifest_dir = self.manifest.manifest_dir.as_ref()
1667 .ok_or_else(|| anyhow::anyhow!("Manifest directory not available for path-only transitive dep"))?;
1668
1669 // Always compute relative path from manifest to target
1670 // This handles both inside (agents/foo.md) and outside (../shared/utils.md) cases
1671 let dep_path_str = match manifest_dir.canonicalize() {
1672 Ok(canonical_manifest) => {
1673 // Normal case: both paths are canonical, compute relative path
1674 crate::utils::compute_relative_path(
1675 &canonical_manifest,
1676 &trans_canonical,
1677 )
1678 }
1679 Err(e) => {
1680 // Canonicalization failed (broken symlink, permissions, etc.)
1681 // We MUST still produce a relative path for lockfile portability.
1682 // Use the non-canonical manifest_dir - not ideal but better than absolute paths.
1683 eprintln!(
1684 "Warning: Could not canonicalize manifest directory {}: {}. Using non-canonical path for relative path computation.",
1685 manifest_dir.display(),
1686 e
1687 );
1688 crate::utils::compute_relative_path(
1689 manifest_dir,
1690 &trans_canonical,
1691 )
1692 }
1693 };
1694
1695 // Determine tool for transitive dependency
1696 let trans_tool = if let Some(explicit_tool) = &dep_spec.tool {
1697 Some(explicit_tool.clone())
1698 } else {
1699 let parent_tool =
1700 dep.get_tool().map(str::to_string).unwrap_or_else(|| {
1701 self.manifest.get_default_tool(resource_type)
1702 });
1703 if self
1704 .manifest
1705 .is_resource_supported(&parent_tool, dep_resource_type)
1706 {
1707 Some(parent_tool)
1708 } else {
1709 Some(self.manifest.get_default_tool(dep_resource_type))
1710 }
1711 };
1712
1713 ResourceDependency::Detailed(Box::new(DetailedDependency {
1714 source: None,
1715 path: crate::utils::normalize_path_for_storage(dep_path_str),
1716 version: None,
1717 branch: None,
1718 rev: None,
1719 command: None,
1720 args: None,
1721 target: None,
1722 filename: None,
1723 dependencies: None,
1724 tool: trans_tool,
1725 flatten: None,
1726 }))
1727 } else {
1728 // Git-backed transitive dep (parent is Git-backed)
1729 // The resolved path is within the worktree - need to convert back to repo-relative
1730 let source_name = dep.get_source().ok_or_else(|| {
1731 anyhow::anyhow!("Expected source for Git-backed dependency")
1732 })?;
1733 let version = dep.get_version().unwrap_or("main").to_string();
1734 let source_url =
1735 self.source_manager.get_source_url(source_name).ok_or_else(
1736 || anyhow::anyhow!("Source '{source_name}' not found"),
1737 )?;
1738
1739 // Get repo-relative path by stripping the appropriate prefix
1740 let repo_relative = if crate::utils::is_local_path(&source_url) {
1741 // For local directory sources, strip the source path to get relative path
1742 let source_path = PathBuf::from(&source_url).canonicalize()?;
1743 trans_canonical.strip_prefix(&source_path)
1744 .with_context(|| format!(
1745 "Transitive dep resolved outside parent's source directory: {} not under {}",
1746 trans_canonical.display(),
1747 source_path.display()
1748 ))?
1749 .to_path_buf()
1750 } else {
1751 // For Git sources, get worktree and strip it
1752 let sha = self
1753 .prepared_versions
1754 .get(&Self::group_key(source_name, &version))
1755 .ok_or_else(|| {
1756 anyhow::anyhow!(
1757 "Parent version not resolved for {}",
1758 source_name
1759 )
1760 })?
1761 .resolved_commit
1762 .clone();
1763 let worktree_path = self
1764 .cache
1765 .get_or_create_worktree_for_sha(
1766 source_name,
1767 &source_url,
1768 &sha,
1769 None,
1770 )
1771 .await?;
1772
1773 // Canonicalize worktree path to handle symlinks (e.g., /var -> /private/var on macOS)
1774 // and ensure consistent path formats on Windows (\\?\ prefix)
1775 let canonical_worktree =
1776 worktree_path.canonicalize().with_context(|| {
1777 format!(
1778 "Failed to canonicalize worktree path: {}",
1779 worktree_path.display()
1780 )
1781 })?;
1782
1783 trans_canonical.strip_prefix(&canonical_worktree)
1784 .with_context(|| format!(
1785 "Transitive dep resolved outside parent's worktree: {} not under {}",
1786 trans_canonical.display(),
1787 canonical_worktree.display()
1788 ))?
1789 .to_path_buf()
1790 };
1791
1792 // Determine tool for transitive dependency
1793 let trans_tool = if let Some(explicit_tool) = &dep_spec.tool {
1794 Some(explicit_tool.clone())
1795 } else {
1796 let parent_tool =
1797 dep.get_tool().map(str::to_string).unwrap_or_else(|| {
1798 self.manifest.get_default_tool(resource_type)
1799 });
1800 if self
1801 .manifest
1802 .is_resource_supported(&parent_tool, dep_resource_type)
1803 {
1804 Some(parent_tool)
1805 } else {
1806 Some(self.manifest.get_default_tool(dep_resource_type))
1807 }
1808 };
1809
1810 ResourceDependency::Detailed(Box::new(DetailedDependency {
1811 source: Some(source_name.to_string()),
1812 path: crate::utils::normalize_path_for_storage(
1813 repo_relative.to_string_lossy().to_string(),
1814 ),
1815 version: dep_spec
1816 .version
1817 .clone()
1818 .or_else(|| dep.get_version().map(|v| v.to_string())),
1819 branch: None,
1820 rev: None,
1821 command: None,
1822 args: None,
1823 target: None,
1824 filename: None,
1825 dependencies: None,
1826 tool: trans_tool,
1827 flatten: None,
1828 }))
1829 };
1830
1831 // Generate a name for the transitive dependency
1832 // Use explicit name from DependencySpec if provided, otherwise derive from path
1833 let trans_name = dep_spec
1834 .name
1835 .clone()
1836 .unwrap_or_else(|| self.generate_dependency_name(trans_dep.get_path()));
1837
1838 // Add to graph (use source-aware nodes to prevent false cycles)
1839 let trans_source =
1840 trans_dep.get_source().map(std::string::ToString::to_string);
1841 let trans_tool = trans_dep.get_tool().map(std::string::ToString::to_string);
1842 let from_node =
1843 DependencyNode::with_source(resource_type, &name, source.clone());
1844 let to_node = DependencyNode::with_source(
1845 dep_resource_type,
1846 &trans_name,
1847 trans_source.clone(),
1848 );
1849 graph.add_dependency(from_node.clone(), to_node.clone());
1850
1851 // Track in dependency map (use singular form from enum for dependency references)
1852 // Include source to prevent cross-source contamination
1853 let from_key = (resource_type, name.clone(), source.clone(), tool.clone());
1854 let dep_ref = format!("{dep_resource_type}/{trans_name}");
1855 self.dependency_map.entry(from_key).or_default().push(dep_ref);
1856
1857 // Add to conflict detector for tracking version requirements
1858 self.add_to_conflict_detector(&trans_name, &trans_dep, &name);
1859
1860 // Check for version conflicts and resolve them
1861 let trans_key = (
1862 dep_resource_type,
1863 trans_name.clone(),
1864 trans_source.clone(),
1865 trans_tool.clone(),
1866 );
1867
1868 if let Some(existing_dep) = all_deps.get(&trans_key) {
1869 // Version conflict detected (same name and source, different version)
1870 let resolved_dep = self
1871 .resolve_version_conflict(
1872 &trans_name,
1873 existing_dep,
1874 &trans_dep,
1875 &name, // Who requires this version
1876 )
1877 .await?;
1878
1879 // Only re-queue if the resolved version differs from existing
1880 let needs_reprocess =
1881 resolved_dep.get_version() != existing_dep.get_version();
1882
1883 all_deps.insert(trans_key.clone(), resolved_dep.clone());
1884
1885 if needs_reprocess {
1886 // Remove from processed so the resolved version can be processed
1887 // This ensures we fetch metadata for the correct version
1888 processed.remove(&trans_key);
1889 // Re-queue the resolved dependency
1890 queue.push((
1891 trans_name.clone(),
1892 resolved_dep,
1893 Some(dep_resource_type),
1894 ));
1895 }
1896 } else {
1897 // No conflict, add the dependency
1898 tracing::debug!(
1899 "Adding transitive dep '{}' to all_deps and queue (parent: {})",
1900 trans_name,
1901 name
1902 );
1903 all_deps.insert(trans_key.clone(), trans_dep.clone());
1904 queue.push((trans_name, trans_dep, Some(dep_resource_type)));
1905 }
1906 }
1907 }
1908 }
1909 }
1910
1911 // Check for circular dependencies
1912 graph.detect_cycles()?;
1913
1914 // Get topological order for dependencies that have relationships
1915 let ordered_nodes = graph.topological_order()?;
1916
1917 // Build result: start with topologically ordered dependencies
1918 let mut result = Vec::new();
1919 let mut added_keys = HashSet::new();
1920
1921 tracing::debug!(
1922 "Transitive resolution - topological order has {} nodes, all_deps has {} entries",
1923 ordered_nodes.len(),
1924 all_deps.len()
1925 );
1926
1927 for node in ordered_nodes {
1928 tracing::debug!(
1929 "Processing ordered node: {}/{} (source: {:?})",
1930 node.resource_type,
1931 node.name,
1932 node.source
1933 );
1934 // Find matching dependency - now that nodes include source, we can match precisely
1935 for (key, dep) in &all_deps {
1936 if key.0 == node.resource_type && key.1 == node.name && key.2 == node.source {
1937 tracing::debug!(
1938 " -> Found match in all_deps, adding to result with type {:?}",
1939 node.resource_type
1940 );
1941 result.push((node.name.clone(), dep.clone(), node.resource_type));
1942 added_keys.insert(key.clone());
1943 break; // Exact match found, no need to continue
1944 }
1945 }
1946 }
1947
1948 // Add remaining dependencies that weren't in the graph (no transitive deps)
1949 // These can be added in any order since they have no dependencies
1950 // IMPORTANT: Filter out patterns - they should only serve as expansion points,
1951 // not final dependencies. The concrete deps from expansion are what we want.
1952 for (key, dep) in all_deps {
1953 if !added_keys.contains(&key) {
1954 // Skip pattern dependencies - they were expanded to concrete deps
1955 if dep.is_pattern() {
1956 tracing::debug!(
1957 "Skipping pattern dependency in final result: {}/{} (source: {:?})",
1958 key.0,
1959 key.1,
1960 key.2
1961 );
1962 continue;
1963 }
1964
1965 tracing::debug!(
1966 "Adding non-graph dependency: {}/{} (source: {:?}) with type {:?}",
1967 key.0,
1968 key.1,
1969 key.2,
1970 key.0
1971 );
1972 result.push((key.1.clone(), dep.clone(), key.0));
1973 }
1974 }
1975
1976 tracing::debug!("Transitive resolution returning {} dependencies", result.len());
1977
1978 Ok(result)
1979 }
1980
1981 /// Fetch the content of a resource for metadata extraction.
1982 async fn fetch_resource_content(
1983 &mut self,
1984 _name: &str,
1985 dep: &ResourceDependency,
1986 ) -> Result<String> {
1987 match dep {
1988 ResourceDependency::Simple(path) => {
1989 // Local file - resolve relative to manifest directory
1990 let manifest_dir = self.manifest.manifest_dir.as_ref().ok_or_else(|| {
1991 anyhow::anyhow!("Manifest directory not available for Simple dependency")
1992 })?;
1993 let full_path =
1994 crate::utils::resolve_path_relative_to_manifest(manifest_dir, path)?;
1995 std::fs::read_to_string(&full_path)
1996 .with_context(|| format!("Failed to read local file: {}", full_path.display()))
1997 }
1998 ResourceDependency::Detailed(detailed) => {
1999 if let Some(source_name) = &detailed.source {
2000 let source_url = self
2001 .source_manager
2002 .get_source_url(source_name)
2003 .ok_or_else(|| anyhow::anyhow!("Source '{source_name}' not found"))?;
2004
2005 // Check if this is a local directory source
2006 if crate::utils::is_local_path(&source_url) {
2007 // Local directory source - read directly from path
2008 let file_path = PathBuf::from(&source_url).join(&detailed.path);
2009 std::fs::read_to_string(&file_path).with_context(|| {
2010 format!("Failed to read local file: {}", file_path.display())
2011 })
2012 } else {
2013 // Git-based remote dependency - need to checkout and read
2014 // Use get_version() to respect rev > branch > version precedence
2015 let version = dep.get_version().unwrap_or("main").to_string();
2016
2017 // Check if we already have this version resolved
2018 let sha = if let Some(prepared) =
2019 self.prepared_versions.get(&Self::group_key(source_name, &version))
2020 {
2021 prepared.resolved_commit.clone()
2022 } else {
2023 // Need to resolve this version
2024 self.version_resolver.add_version(
2025 source_name,
2026 &source_url,
2027 Some(&version),
2028 );
2029 self.version_resolver.resolve_all().await?;
2030
2031 let resolved_sha = self
2032 .version_resolver
2033 .get_resolved_sha(source_name, &version)
2034 .ok_or_else(|| {
2035 anyhow::anyhow!(
2036 "Failed to resolve version for {source_name} @ {version}"
2037 )
2038 })?;
2039
2040 // Cache the resolved version for nested transitive dependency resolution
2041 // Note: worktree_path will be set when get_or_create_worktree_for_sha is called below
2042 self.prepared_versions.insert(
2043 Self::group_key(source_name, &version),
2044 PreparedSourceVersion {
2045 worktree_path: PathBuf::new(), // Placeholder, will be set after worktree creation
2046 resolved_version: Some(version.clone()),
2047 resolved_commit: resolved_sha.clone(),
2048 },
2049 );
2050
2051 resolved_sha
2052 };
2053
2054 // Get worktree for this SHA
2055 let worktree_path = self
2056 .cache
2057 .get_or_create_worktree_for_sha(source_name, &source_url, &sha, None)
2058 .await?;
2059
2060 // Read the file from worktree (with cache coherency retry)
2061 let file_path = worktree_path.join(&detailed.path);
2062 read_with_cache_retry_sync(&file_path)
2063 }
2064 } else {
2065 // Local dependency with detailed spec - resolve relative to manifest directory
2066 let manifest_dir = self.manifest.manifest_dir.as_ref().ok_or_else(|| {
2067 anyhow::anyhow!(
2068 "Manifest directory not available for local Detailed dependency"
2069 )
2070 })?;
2071 let full_path = crate::utils::resolve_path_relative_to_manifest(
2072 manifest_dir,
2073 &detailed.path,
2074 )?;
2075 std::fs::read_to_string(&full_path).with_context(|| {
2076 format!("Failed to read local file: {}", full_path.display())
2077 })
2078 }
2079 }
2080 }
2081 }
2082
2083 /// Gets the canonical file path for a dependency (unified for Git and path-only).
2084 ///
2085 /// For path-only deps: resolves from manifest directory
2086 /// For Git-backed deps: resolves from worktree path
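///
/// # Example
///
/// Internal usage sketch (marked `ignore` because the method is private);
/// `dep` stands for any already-prepared [`ResourceDependency`]:
///
/// ```rust,ignore
/// let parent_file = resolver.get_canonical_path_for_dependency(&dep).await?;
/// // `parent_file` then serves as the base for file-relative transitive paths.
/// ```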
2087 async fn get_canonical_path_for_dependency(
2088 &mut self,
2089 dep: &ResourceDependency,
2090 ) -> Result<PathBuf> {
2091 if dep.get_source().is_none() {
2092 // Path-only: resolve from manifest directory
2093 let manifest_dir = self
2094 .manifest
2095 .manifest_dir
2096 .as_ref()
2097 .ok_or_else(|| anyhow::anyhow!("Manifest directory not available"))?;
2098 crate::utils::resolve_path_relative_to_manifest(manifest_dir, dep.get_path())
2099 } else {
2100 // Git-backed: get worktree path and join with repo-relative path
2101 let source_name = dep
2102 .get_source()
2103 .ok_or_else(|| anyhow::anyhow!("Cannot get worktree for path-only dependency"))?;
2104 let version = dep.get_version().unwrap_or("main").to_string();
2105
2106 // Get source URL
2107 let source_url = self
2108 .source_manager
2109 .get_source_url(source_name)
2110 .ok_or_else(|| anyhow::anyhow!("Source '{source_name}' not found"))?;
2111
2112 // Check if this is a local directory source (not a Git repo)
2113 if crate::utils::is_local_path(&source_url) {
2114 // Local directory source - resolve directly from source path
2115 let file_path = PathBuf::from(&source_url).join(dep.get_path());
2116 file_path.canonicalize().with_context(|| {
2117 format!("Failed to canonicalize local source resource: {}", file_path.display())
2118 })
2119 } else {
2120 // Git-backed: resolve from worktree
2121 // Get the resolved SHA
2122 let sha = if let Some(prepared) =
2123 self.prepared_versions.get(&Self::group_key(source_name, &version))
2124 {
2125 prepared.resolved_commit.clone()
2126 } else {
2127 // Need to resolve this version
2128 self.version_resolver.add_version(source_name, &source_url, Some(&version));
2129 self.version_resolver.resolve_all().await?;
2130
2131 self.version_resolver.get_resolved_sha(source_name, &version).ok_or_else(
2132 || {
2133 anyhow::anyhow!(
2134 "Failed to resolve version for {source_name} @ {version}"
2135 )
2136 },
2137 )?
2138 };
2139
2140 // Get worktree path
2141 let worktree_path = self
2142 .cache
2143 .get_or_create_worktree_for_sha(source_name, &source_url, &sha, None)
2144 .await?;
2145
2146 // Join with repo-relative path and canonicalize
2147 let full_path = worktree_path.join(dep.get_path());
2148 full_path.canonicalize().with_context(|| {
2149 format!("Failed to canonicalize Git resource: {}", full_path.display())
2150 })
2151 }
2152 }
2153 }
2154
2155 /// Expands a pattern dependency to concrete (non-pattern) dependencies.
2156 ///
2157 /// This method is used during transitive dependency resolution to handle
2158 /// glob patterns declared in resource frontmatter. It expands the pattern
2159 /// to all matching files and creates a concrete ResourceDependency for each.
2160 ///
2161 /// # Arguments
2162 ///
2163 /// * `name` - The dependency name (used for logging)
2164 /// * `dep` - The pattern-based dependency to expand
2165 /// * `resource_type` - The resource type (for path construction)
2166 ///
2167 /// # Returns
2168 ///
2169 /// A vector of (name, ResourceDependency) tuples for each matched file.
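///
/// # Example
///
/// Internal usage sketch (marked `ignore` because the method is private); the
/// alias `"review-agents"` and the pattern dependency are hypothetical:
///
/// ```rust,ignore
/// let concrete = resolver
///     .expand_pattern_to_concrete_deps("review-agents", &pattern_dep, ResourceType::Agent)
///     .await?;
/// for (name, dep) in &concrete {
///     println!("{name} -> {}", dep.get_path());
/// }
/// ```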
2170 async fn expand_pattern_to_concrete_deps(
2171 &self,
2172 _name: &str,
2173 dep: &ResourceDependency,
2174 _resource_type: crate::core::ResourceType,
2175 ) -> Result<Vec<(String, ResourceDependency)>> {
2176 use crate::manifest::DetailedDependency;
2177
2178 let pattern = dep.get_path();
2179
2180 if dep.is_local() {
2181 // Local pattern dependency - search in filesystem
2182 // For absolute patterns, split at the first glob component: everything before it becomes the base directory and the remainder becomes the search pattern
2183 // For relative patterns, use current directory
2184 let pattern_path = Path::new(pattern);
2185 let (base_path, search_pattern) = if pattern_path.is_absolute() {
2186 // Absolute pattern: extract base directory and relative pattern
2187 // Example: "/tmp/xyz/agents/*.md" -> base="/tmp/xyz", pattern="agents/*.md"
2188 let components: Vec<_> = pattern_path.components().collect();
2189
2190 // Find the first component with a glob character
2191 let glob_idx = components.iter().position(|c| {
2192 let s = c.as_os_str().to_string_lossy();
2193 s.contains('*') || s.contains('?') || s.contains('[')
2194 });
2195
2196 if let Some(idx) = glob_idx {
2197 // Split at the glob component
2198 let base_components = &components[..idx];
2199 let pattern_components = &components[idx..];
2200
2201 let base: PathBuf = base_components.iter().collect();
2202 let pattern: String = pattern_components
2203 .iter()
2204 .map(|c| c.as_os_str().to_string_lossy())
2205 .collect::<Vec<_>>()
2206 .join("/");
2207
2208 (base, pattern)
2209 } else {
2210 // No glob characters, use as-is
2211 (PathBuf::from("."), pattern.to_string())
2212 }
2213 } else {
2214 // Relative pattern, use current directory
2215 (PathBuf::from("."), pattern.to_string())
2216 };
2217
2218 let pattern_resolver = crate::pattern::PatternResolver::new();
2219 let matches = pattern_resolver.resolve(&search_pattern, &base_path)?;
2220
2221 // Get tool, target, and flatten from parent pattern dependency
2222 let (tool, target, flatten) = match dep {
2223 ResourceDependency::Detailed(d) => (d.tool.clone(), d.target.clone(), d.flatten),
2224 _ => (None, None, None),
2225 };
2226
2227 let mut concrete_deps = Vec::new();
2228 for matched_path in matches {
2229 let resource_name = crate::pattern::extract_resource_name(&matched_path);
2230 // Convert matched path to absolute by joining with base_path
2231 let absolute_path = base_path.join(&matched_path);
2232 let concrete_path = absolute_path.to_string_lossy().to_string();
2233
2234 // Create a non-pattern Detailed dependency, inheriting tool, target, and flatten from parent
2235 concrete_deps.push((
2236 resource_name,
2237 ResourceDependency::Detailed(Box::new(DetailedDependency {
2238 source: None,
2239 path: concrete_path,
2240 version: None,
2241 branch: None,
2242 rev: None,
2243 command: None,
2244 args: None,
2245 target: target.clone(),
2246 filename: None,
2247 dependencies: None,
2248 tool: tool.clone(),
2249 flatten,
2250 })),
2251 ));
2252 }
2253
2254 Ok(concrete_deps)
2255 } else {
2256 // Git-based remote dependency - need worktree
2257 let source_name = dep
2258 .get_source()
2259 .ok_or_else(|| anyhow::anyhow!("Pattern dependency missing source"))?;
2260
2261 // Get the prepared worktree for this source/version
2262 let version_key = dep
2263 .get_version()
2264 .map_or_else(|| "HEAD".to_string(), std::string::ToString::to_string);
2265 let prepared_key = Self::group_key(source_name, &version_key);
2266
2267 let prepared = self.prepared_versions.get(&prepared_key).ok_or_else(|| {
2268 anyhow::anyhow!(
2269 "Prepared state missing for source '{}' @ '{}'. Pattern expansion requires prepared worktree.",
2270 source_name,
2271 version_key
2272 )
2273 })?;
2274
2275 let repo_path = &prepared.worktree_path;
2276
2277 // Search for matching files in the repository
2278 let pattern_resolver = crate::pattern::PatternResolver::new();
2279 let matches = pattern_resolver.resolve(pattern, Path::new(repo_path))?;
2280
2281 // Create concrete dependencies for each match
2282 let mut concrete_deps = Vec::new();
2283 for matched_path in matches {
2284 let resource_name = crate::pattern::extract_resource_name(&matched_path);
2285
2286 // Use the matched path as-is (it's already relative to repo root)
2287 // The matched path includes the resource type directory (e.g., "snippets/helper-one.md")
2288 let concrete_path = matched_path.to_string_lossy().to_string();
2289
2290 // Get tool, target, and flatten from parent
2291 let (tool, target, flatten) = match dep {
2292 ResourceDependency::Detailed(d) => {
2293 (d.tool.clone(), d.target.clone(), d.flatten)
2294 }
2295 _ => (None, None, None),
2296 };
2297
2298 // Create a non-pattern Detailed dependency, inheriting tool, target, and flatten from parent
2299 concrete_deps.push((
2300 resource_name,
2301 ResourceDependency::Detailed(Box::new(DetailedDependency {
2302 source: Some(source_name.to_string()),
2303 path: concrete_path,
2304 version: dep.get_version().map(std::string::ToString::to_string),
2305 branch: None,
2306 rev: None,
2307 command: None,
2308 args: None,
2309 target, // Inherit custom target from parent pattern
2310 filename: None,
2311 dependencies: None,
2312 tool,
2313 flatten,
2314 })),
2315 ));
2316 }
2317
2318 Ok(concrete_deps)
2319 }
2320 }
2321
2322 /// Generate a dependency name from a path.
2323 /// Creates collision-resistant names by preserving directory structure.
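///
/// # Examples
///
/// Expected mappings, taken from the rules documented in the body below
/// (shown as `ignore` because the helper is private):
///
/// ```rust,ignore
/// assert_eq!(resolver.generate_dependency_name("agents/ai/helper.md"), "ai/helper");
/// assert_eq!(resolver.generate_dependency_name("commit.md"), "commit");
/// assert_eq!(resolver.generate_dependency_name("../shared/utils.md"), "../shared/utils");
/// ```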
2324 fn generate_dependency_name(&self, path: &str) -> String {
2325 // Convert path to a collision-resistant name
2326 // Example: "agents/ai/helper.md" -> "ai/helper"
2327 // Example: "snippets/commands/commit.md" -> "commands/commit"
2328 // Example: "commit.md" -> "commit"
2329 // Example (absolute): "/private/tmp/shared/snippets/utils.md" -> "/private/tmp/shared/snippets/utils"
2330 // Example (Windows absolute): "C:/team/tools/foo.md" -> "C:/team/tools/foo"
2331 // Example (parent-relative): "../shared/utils.md" -> "../shared/utils"
2332
2333 let path = Path::new(path);
2334
2335 // Get the path without extension
2336 let without_ext = path.with_extension("");
2337
2338 // Convert to string and normalize separators to forward slashes
2339 // This ensures consistent behavior on Windows where Path::to_string_lossy()
2340 // produces backslashes, which would break our split('/') logic below
2341 let path_str = without_ext.to_string_lossy().replace('\\', "/");
2342
2343 // Check if this is an absolute path or starts with ../ (cross-directory)
2344 // Note: With the fix to always use manifest-relative paths (even with ../),
2345 // lockfiles should never contain absolute paths. We check path.is_absolute()
2346 // defensively for manually-edited lockfiles.
2347 let is_absolute = path.is_absolute();
2348 let is_cross_directory = path_str.starts_with("../");
2349
2350 // If the path has multiple components, skip the first directory (resource type)
2351 // to avoid redundancy, but keep subdirectories for uniqueness
2352 // EXCEPTIONS that keep all components to avoid collisions:
2353 // 1. Absolute paths (e.g., C:/team/tools/foo.md vs D:/team/tools/foo.md)
2354 // 2. Cross-directory paths with ../ (e.g., ../shared/a.md vs ../other/a.md)
2355 let components: Vec<&str> = path_str.split('/').collect();
2356 if is_absolute || is_cross_directory {
2357 // Keep all components to avoid collisions
2358 path_str.to_string()
2359 } else if components.len() > 1 {
2360 // Regular relative path: skip the first component (e.g., "agents", "snippets") and join the rest
2361 components[1..].join("/")
2362 } else {
2363 // Single component, just return it
2364 components[0].to_string()
2365 }
2366 }
2367
2368 /// Resolve all manifest dependencies into a deterministic lockfile.
2369 ///
2370 /// This is the primary entry point for dependency resolution. It resolves all
2371 /// dependencies from the manifest (including transitive dependencies) and
2372 /// generates a complete lockfile with resolved versions and commit SHAs.
2373 ///
2374 /// By default, this method enables transitive dependency resolution. Resources
2375 /// can declare their own dependencies via YAML frontmatter (Markdown) or JSON
2376 /// fields, which will be automatically discovered and resolved.
2377 ///
2378 /// # Transitive Dependency Resolution
2379 ///
2380 /// When enabled (default), the resolver:
2381 /// 1. Resolves direct manifest dependencies
2382 /// 2. Extracts dependency metadata from resource files
2383 /// 3. Builds a dependency graph with cycle detection
2384 /// 4. Resolves transitive dependencies in topological order
2385 ///
2386 /// # Returns
2387 ///
2388 /// A complete [`LockFile`] with all resolved dependencies including:
2389 /// - Resolved commit SHAs for reproducible installations
2390 /// - Checksum fields for integrity verification (populated later during installation)
2391 /// - Installation paths for all resources
2392 /// - Source repository information
2393 ///
2394 /// # Errors
2395 ///
2396 /// Returns an error if:
2397 /// - Source repositories cannot be accessed
2398 /// - Version constraints cannot be satisfied
2399 /// - Circular dependencies are detected
2400 /// - Resource files cannot be read or parsed
2401 ///
2402 /// # Example
2403 ///
2404 /// ```rust,no_run
2405 /// # use agpm_cli::resolver::DependencyResolver;
2406 /// # use agpm_cli::manifest::Manifest;
2407 /// # use agpm_cli::cache::Cache;
2408 /// # async fn example() -> anyhow::Result<()> {
2409 /// let manifest = Manifest::load("agpm.toml".as_ref())?;
2410 /// let cache = Cache::new()?;
2411 /// let mut resolver = DependencyResolver::new(manifest, cache)?;
2412 ///
2413 /// // Resolve all dependencies including transitive ones
2414 /// let lockfile = resolver.resolve().await?;
2415 ///
2416 /// lockfile.save("agpm.lock".as_ref())?;
2417 /// println!("Resolved {} total resources",
2418 /// lockfile.agents.len() + lockfile.snippets.len());
2419 /// # Ok(())
2420 /// # }
2421 /// ```
2422 pub async fn resolve(&mut self) -> Result<LockFile> {
2423 self.resolve_with_options(true).await
2424 }
2425
2426 /// Resolve dependencies with configurable transitive dependency support.
2427 ///
2428 /// This method provides fine-grained control over dependency resolution behavior,
2429 /// allowing you to disable transitive dependency resolution when needed. This is
2430 /// useful for debugging, testing, or when you want to install only direct
2431 /// dependencies without their transitive requirements.
2432 ///
2433 /// # Arguments
2434 ///
2435 /// * `enable_transitive` - Whether to resolve transitive dependencies
2436 /// - `true`: Full transitive resolution (default behavior)
2437 /// - `false`: Only direct manifest dependencies
2438 ///
2439 /// # Transitive Resolution Details
2440 ///
2441 /// When `enable_transitive` is `true`:
2442 /// - Resources are checked for embedded dependency metadata
2443 /// - Markdown files (.md): YAML frontmatter between `---` delimiters
2444 /// - JSON files (.json): Top-level `dependencies` field
2445 /// - Dependency graph is built with cycle detection
2446 /// - Dependencies are resolved in topological order
2447 ///
2448 /// When `enable_transitive` is `false`:
2449 /// - Only dependencies explicitly declared in `agpm.toml` are resolved
2450 /// - Resource metadata is not extracted or processed
2451 /// - Faster resolution for known dependency trees
2452 ///
2453 /// # Returns
2454 ///
2455 /// A [`LockFile`] containing all resolved dependencies according to the
2456 /// configuration. When transitive resolution is disabled, the lockfile will
2457 /// only contain direct dependencies from the manifest.
2458 ///
2459 /// # Errors
2460 ///
2461 /// Returns an error if:
2462 /// - Source repositories are inaccessible or invalid
2463 /// - Version constraints conflict or cannot be satisfied
2464 /// - Circular dependencies are detected (when `enable_transitive` is true)
2465 /// - Resource files cannot be read or contain invalid metadata
2466 /// - Network operations fail during source synchronization
2467 ///
2468 /// # Performance Considerations
2469 ///
2470 /// Disabling transitive resolution (`enable_transitive = false`) can improve
2471 /// performance when:
2472 /// - All required dependencies are already explicitly listed
2473 /// - Testing specific dependency combinations
2474 /// - Debugging dependency resolution issues
2475 /// - Working with large resources that have expensive metadata extraction
2476 ///
2477 /// # Example
2478 ///
2479 /// ```rust,no_run
2480 /// # use agpm_cli::resolver::DependencyResolver;
2481 /// # use agpm_cli::manifest::Manifest;
2482 /// # use agpm_cli::cache::Cache;
2483 /// # async fn example() -> anyhow::Result<()> {
2484 /// let manifest = Manifest::load("agpm.toml".as_ref())?;
2485 /// let cache = Cache::new()?;
2486 /// let mut resolver = DependencyResolver::new(manifest, cache)?;
2487 ///
2488 /// // Resolve only direct dependencies without transitive resolution
2489 /// let lockfile = resolver.resolve_with_options(false).await?;
2490 ///
2491 /// println!("Resolved {} direct dependencies",
2492 /// lockfile.agents.len() + lockfile.snippets.len());
2493 /// # Ok(())
2494 /// # }
2495 /// ```
2496 ///
2497 /// # See Also
2498 ///
2499 /// - [`resolve()`]: Convenience method that enables transitive resolution by default
2500 /// - [`DependencyGraph`]: Graph structure used for cycle detection and ordering
2501 /// - [`DependencySpec`](crate::manifest::DependencySpec): Specification format for transitive dependencies
2502 ///
2503 /// [`resolve()`]: DependencyResolver::resolve
2504 pub async fn resolve_with_options(&mut self, enable_transitive: bool) -> Result<LockFile> {
2505 let mut lockfile = LockFile::new();
2506
2507 // Add sources to lockfile
2508 for (name, url) in &self.manifest.sources {
2509 lockfile.add_source(name.clone(), url.clone(), String::new());
2510 }
2511
2512 // Get all dependencies with their types to avoid mis-typing same-named resources
2513 let base_deps: Vec<(String, ResourceDependency, crate::core::ResourceType)> = self
2514 .manifest
2515 .all_dependencies_with_types()
2516 .into_iter()
2517 .map(|(name, dep, resource_type)| (name.to_string(), dep.into_owned(), resource_type))
2518 .collect();
2519
2520 // Add direct dependencies to conflict detector
2521 for (name, dep, _) in &base_deps {
2522 self.add_to_conflict_detector(name, dep, "manifest");
2523 }
2524
2526 // Sync sources (phase management is handled by caller)
2527 // prepare_remote_groups only needs name and dep, not type
2528 let base_deps_for_prep: Vec<(String, ResourceDependency)> =
2529 base_deps.iter().map(|(name, dep, _)| (name.clone(), dep.clone())).collect();
2530 self.prepare_remote_groups(&base_deps_for_prep).await?;
2531
2532 // Resolve transitive dependencies if enabled
2533 let deps = self.resolve_transitive_dependencies(&base_deps, enable_transitive).await?;
2534
2535 // Resolve each dependency (including transitive ones)
2536 tracing::debug!("resolve_with_options - about to resolve {} dependencies", deps.len());
2537 for (name, dep, resource_type) in &deps {
2538 tracing::debug!(
2539 "Resolving dependency: {} -> {} (type: {:?})",
2540 name,
2541 dep.get_path(),
2542 resource_type
2543 );
2544 // Progress is tracked at the phase level
2545
2546 // Check if this is a pattern dependency
2547 if dep.is_pattern() {
2548 // Pattern dependencies resolve to multiple resources
2549 let entries = self.resolve_pattern_dependency(name, dep, *resource_type).await?;
2550
2551 // Add each resolved entry to the appropriate resource type with deduplication
2552 // Resource type was already determined during transitive resolution
2553 for entry in entries {
2554 // Match by (name, source) to allow same-named resources from different sources;
2555 // every resource type uses the same replace-or-push logic, so pick the bucket once
2556 let bucket = match *resource_type {
2557 crate::core::ResourceType::Agent => &mut lockfile.agents,
2558 crate::core::ResourceType::Snippet => &mut lockfile.snippets,
2559 crate::core::ResourceType::Command => &mut lockfile.commands,
2560 crate::core::ResourceType::Script => &mut lockfile.scripts,
2561 crate::core::ResourceType::Hook => &mut lockfile.hooks,
2562 crate::core::ResourceType::McpServer => &mut lockfile.mcp_servers,
2563 };
2564 if let Some(existing) =
2565 bucket.iter_mut().find(|e| e.name == entry.name && e.source == entry.source)
2566 {
2567 *existing = entry;
2568 } else {
2569 bucket.push(entry);
2570 }
2571 }
2624 } else {
2625 // Regular single dependency
2626 // Pass the resource type from transitive resolution to ensure correct type
2627 // This handles cases where transitive dependencies share the same name
2628 // as direct manifest dependencies but have a different type
2629 let entry = self.resolve_dependency(name, dep, *resource_type).await?;
2630 tracing::debug!(
2631 "Resolved {} to resource_type={:?}, installed_at={}",
2632 name,
2633 entry.resource_type,
2634 entry.installed_at
2635 );
2636 // Add directly to lockfile to preserve (name, source) uniqueness
2637 self.add_or_update_lockfile_entry(&mut lockfile, name, entry);
2638 }
2639
2640 // Progress is tracked by updating messages, no need to increment
2641 }
2642
2643 // Progress is tracked at the phase level
2644
2645 // Progress completion is handled by the caller
2646
2647 // Detect version conflicts before creating lockfile
2648 let conflicts = self.conflict_detector.detect_conflicts();
2649 if !conflicts.is_empty() {
2650 let mut error_msg = String::from("Version conflicts detected:\n\n");
2651 for conflict in &conflicts {
2652 error_msg.push_str(&format!("{conflict}\n"));
2653 }
2654 return Err(AgpmError::Other {
2655 message: error_msg,
2656 }
2657 .into());
2658 }
2659
2660 // Post-process dependencies to add version information
2661 self.add_version_to_dependencies(&mut lockfile)?;
2662
2663 // Detect target-path conflicts before finalizing
2664 self.detect_target_conflicts(&lockfile)?;
2665
2666 Ok(lockfile)
2667 }
2668
2669 /// Resolves a single dependency to a lockfile entry.
2670 ///
2671 /// This internal method handles the resolution of one dependency, including
2672 /// source synchronization, version resolution, and entry creation.
2673 ///
2674 /// # Algorithm
2675 ///
2676 /// For local dependencies:
2677 /// 1. Validate the path format
2678 /// 2. Determine installation location based on resource type
2679 /// 3. Preserve relative directory structure from source path
2680 /// 4. Create entry with relative path (no source sync required)
2681 ///
2682 /// For remote dependencies:
2683 /// 1. Validate source exists in manifest or global config
2684 /// 2. Synchronize source repository (clone or fetch)
2685 /// 3. Resolve version constraint to specific commit
2686 /// 4. Preserve relative directory structure from dependency path
2687 /// 5. Create entry with resolved commit and source information
2688 ///
2689 /// # Parameters
2690 ///
2691 /// - `name`: The dependency name from the manifest
2692 /// - `dep`: The dependency specification with source, path, and version
2693 ///
2694 /// # Returns
2695 ///
2696 /// A [`LockedResource`] with:
2697 /// - Resolved commit hash (for remote dependencies)
2698 /// - Source and URL information
2699 /// - Installation path in the project
2700 /// - Empty checksum (computed during actual installation)
2701 ///
2702 /// # Errors
2703 ///
2704 /// Returns an error if:
2705 /// - Source is not found in manifest or global config
2706 /// - Source repository cannot be cloned or accessed
2707 /// - Version constraint cannot be resolved (tag/branch not found)
2708 /// - Git operations fail due to network or authentication issues
2709 ///
2710 /// [`LockedResource`]: crate::lockfile::LockedResource
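///
/// # Example
///
/// Internal usage sketch (marked `ignore` because the method is private);
/// `"my-agent"` and `dep` stand for a single manifest entry:
///
/// ```rust,ignore
/// let entry = resolver
///     .resolve_dependency("my-agent", &dep, ResourceType::Agent)
///     .await?;
/// assert!(entry.checksum.is_empty()); // checksums are computed at install time
/// ```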
2711 async fn resolve_dependency(
2712 &mut self,
2713 name: &str,
2714 dep: &ResourceDependency,
2715 resource_type: crate::core::ResourceType,
2716 ) -> Result<LockedResource> {
2717 // Check if this is a pattern-based dependency
2718 if dep.is_pattern() {
2719 // Pattern dependencies resolve to multiple resources
2720 // This should be handled by a separate method
2721 return Err(anyhow::anyhow!(
2722 "Pattern dependency '{name}' should be resolved using resolve_pattern_dependency"
2723 ));
2724 }
2725
2726 if dep.is_local() {
2727 // Local dependency - just create entry with path
2728 // Resource type is passed as a parameter to ensure correct type resolution
2729 // for both direct and transitive dependencies
2730
2731 // Determine the filename to use
2732 let filename = if let Some(custom_filename) = dep.get_filename() {
2733 // Use custom filename as-is (includes extension)
2734 custom_filename.to_string()
2735 } else {
2736 // Extract meaningful path structure from dependency path
2737 extract_meaningful_path(Path::new(dep.get_path()))
2738 };
2739
2740 // Determine artifact type - use get_tool() method, then apply defaults
2741 let artifact_type_string = dep
2742 .get_tool()
2743 .map(|s| s.to_string())
2744 .unwrap_or_else(|| self.manifest.get_default_tool(resource_type));
2745 let artifact_type = artifact_type_string.as_str();
2746
2747 // For local resources without a source, just use the name (no version suffix)
2748 let unique_name = name.to_string();
2749
2750 // Hooks and MCP servers are configured in config files, not installed as artifact files
2751 let installed_at = match resource_type {
2752 crate::core::ResourceType::Hook | crate::core::ResourceType::McpServer => {
2753 // Use configured merge target, with fallback to hardcoded defaults
2754 if let Some(merge_target) =
2755 self.manifest.get_merge_target(artifact_type, resource_type)
2756 {
2757 normalize_path_for_storage(merge_target.display().to_string())
2758 } else {
2759 // Fallback to hardcoded defaults if not configured
2760 match resource_type {
2761 crate::core::ResourceType::Hook => {
2762 ".claude/settings.local.json".to_string()
2763 }
2764 crate::core::ResourceType::McpServer => {
2765 if artifact_type == "opencode" {
2766 ".opencode/opencode.json".to_string()
2767 } else {
2768 ".mcp.json".to_string()
2769 }
2770 }
2771 _ => unreachable!(),
2772 }
2773 }
2774 }
2775 _ => {
2776 // For regular resources, get the artifact path
2777 let artifact_path = self
2778 .manifest
2779 .get_artifact_resource_path(artifact_type, resource_type)
2780 .ok_or_else(|| {
2781 // Provide helpful error message with context
2782 let base_msg = format!(
2783 "Resource type '{}' is not supported by tool '{}' for dependency '{}'",
2784 resource_type, artifact_type, name
2785 );
2786
2787 // Check if this looks like a tool name was used as resource type
2788 let resource_type_str = resource_type.to_string();
2789 let hint = if ["claude-code", "opencode", "agpm"].contains(&resource_type_str.as_str()) {
2790 format!(
2791 "\n\nIt looks like '{}' is a tool name, not a resource type.\n\
2792 In transitive dependencies, use resource types (agents, snippets, commands)\n\
2793 as section headers, then specify 'tool: {}' within each dependency.",
2794 resource_type_str, resource_type_str
2795 )
2796 } else {
2797 format!(
2798 "\n\nValid resource types: agent, command, snippet, hook, mcp-server, script\n\
2799 Source file: {}",
2800 dep.get_path()
2801 )
2802 };
2803
2804 anyhow::anyhow!("{}{}", base_msg, hint)
2805 })?;
2806
2807 // Determine flatten behavior: use explicit setting or tool config default
2808 let flatten = dep
2809 .get_flatten()
2810 .or_else(|| {
2811 // Get flatten default from tool config
2812 self.manifest
2813 .get_tool_config(artifact_type)
2814 .and_then(|config| config.resources.get(resource_type.to_plural()))
2815 .and_then(|resource_config| resource_config.flatten)
2816 })
2817 .unwrap_or(false); // Default to false if not configured
2818
2819 let path = if let Some(custom_target) = dep.get_target() {
2820 // Custom target is relative to the artifact's resource directory
2821 let base_target = PathBuf::from(artifact_path.display().to_string())
2822 .join(custom_target.trim_start_matches('/'));
2823 // For custom targets, still strip prefix based on the original artifact path
2824 // (not the custom target), to avoid duplicate directories
2825 let relative_path = compute_relative_install_path(
2826 &artifact_path,
2827 Path::new(&filename),
2828 flatten,
2829 );
2830 base_target.join(relative_path)
2831 } else {
2832 // Use artifact configuration for default path
2833 let relative_path = compute_relative_install_path(
2834 &artifact_path,
2835 Path::new(&filename),
2836 flatten,
2837 );
2838 artifact_path.join(relative_path)
2839 };
2840 normalize_path_for_storage(normalize_path(&path))
2841 }
2842 };
2843
2844 Ok(LockedResource {
2845 name: unique_name,
2846 source: None,
2847 url: None,
2848 path: normalize_path_for_storage(dep.get_path()),
2849 version: None,
2850 resolved_commit: None,
2851 checksum: String::new(),
2852 installed_at,
2853 dependencies: self.get_dependencies_for(name, None, resource_type, dep.get_tool()),
2854 resource_type,
2855 tool: Some(
2856 dep.get_tool()
2857 .map(std::string::ToString::to_string)
2858 .unwrap_or_else(|| self.manifest.get_default_tool(resource_type)),
2859 ),
2860 manifest_alias: self
2861 .pattern_alias_map
2862 .get(&(resource_type, name.to_string()))
2863 .cloned(), // Check if this came from a pattern expansion
2864 applied_patches: HashMap::new(), // Populated during installation, not resolution
2865 })
2866 } else {
2867 // Remote dependency - need to sync and resolve
2868 let source_name = dep.get_source().ok_or_else(|| AgpmError::ConfigError {
2869 message: format!("Dependency '{name}' has no source specified"),
2870 })?;
2871
2872 // Get source URL
2873 let source_url = self.source_manager.get_source_url(source_name).ok_or_else(|| {
2874 AgpmError::SourceNotFound {
2875 name: source_name.to_string(),
2876 }
2877 })?;
2878
2879 let version_key = dep
2880 .get_version()
2881 .map_or_else(|| "HEAD".to_string(), std::string::ToString::to_string);
2882 let prepared_key = Self::group_key(source_name, &version_key);
2883
2884 // Check if this dependency has been prepared
2885 let (resolved_version, resolved_commit) = if let Some(prepared) =
2886 self.prepared_versions.get(&prepared_key)
2887 {
2888 // Use prepared version
2889 (prepared.resolved_version.clone(), prepared.resolved_commit.clone())
2890 } else {
2891 // This dependency wasn't prepared (e.g., when called from `agpm add`)
2892 // We need to prepare it on-demand
2893 let deps = vec![(name.to_string(), dep.clone())];
2894 self.prepare_remote_groups(&deps).await?;
2895
2896 // Now it should be prepared
2897 if let Some(prepared) = self.prepared_versions.get(&prepared_key) {
2898 (prepared.resolved_version.clone(), prepared.resolved_commit.clone())
2899 } else {
2900 return Err(anyhow::anyhow!(
2901 "Failed to prepare dependency '{name}' from source '{source_name}' @ '{version_key}'"
2902 ));
2903 }
2904 };
2905
2906 // Resource type is passed as a parameter to ensure correct type resolution
2907 // for both direct and transitive dependencies
2908
2909 // Determine the filename to use
2910 let filename = if let Some(custom_filename) = dep.get_filename() {
2911 // Use custom filename as-is (includes extension)
2912 custom_filename.to_string()
2913 } else {
2914 // Preserve the dependency path structure directly
2915 let dep_path = Path::new(dep.get_path());
2916 dep_path.to_string_lossy().to_string()
2917 };
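            // Illustrative (hypothetical) outcomes of the branch above:
            //   path "agents/ai/helper.md" + filename "reviewer.md" -> "reviewer.md"
            //   path "agents/ai/helper.md" + no filename            -> "agents/ai/helper.md"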
2918
2919 // Determine artifact type - use get_tool() method, then apply defaults
2920 // Extract artifact_type from dependency - convert to String for lockfile
2921 let artifact_type_string = dep
2922 .get_tool()
2923 .map(std::string::ToString::to_string)
2924 .unwrap_or_else(|| self.manifest.get_default_tool(resource_type));
2925 let artifact_type = artifact_type_string.as_str();
2926
2927 // Use simple name from manifest - lockfile entries are identified by (name, source)
2928 // Multiple entries with the same name but different sources can coexist
2929 // Version updates replace the existing entry for the same (name, source) pair
2930 let unique_name = name.to_string();
2931
2932 // Hooks and MCP servers are configured in config files, not installed as artifact files
2933 let installed_at = match resource_type {
2934 crate::core::ResourceType::Hook | crate::core::ResourceType::McpServer => {
2935 // Use configured merge target, with fallback to hardcoded defaults
2936 if let Some(merge_target) =
2937 self.manifest.get_merge_target(artifact_type, resource_type)
2938 {
2939 normalize_path_for_storage(merge_target.display().to_string())
2940 } else {
2941 // Fallback to hardcoded defaults if not configured
2942 match resource_type {
2943 crate::core::ResourceType::Hook => {
2944 ".claude/settings.local.json".to_string()
2945 }
2946 crate::core::ResourceType::McpServer => {
2947 if artifact_type == "opencode" {
2948 ".opencode/opencode.json".to_string()
2949 } else {
2950 ".mcp.json".to_string()
2951 }
2952 }
2953 _ => unreachable!(),
2954 }
2955 }
2956 }
2957 _ => {
2958 // For regular resources, get the artifact path
2959 let artifact_path = self
2960 .manifest
2961 .get_artifact_resource_path(artifact_type, resource_type)
2962 .ok_or_else(|| {
2963 // Provide helpful error message with context
2964 let base_msg = format!(
2965 "Resource type '{}' is not supported by tool '{}' for dependency '{}'",
2966 resource_type, artifact_type, name
2967 );
2968
2969 // Check if this looks like a tool name was used as resource type
2970 let resource_type_str = resource_type.to_string();
2971 let hint = if ["claude-code", "opencode", "agpm"].contains(&resource_type_str.as_str()) {
2972 format!(
2973 "\n\nIt looks like '{}' is a tool name, not a resource type.\n\
2974 In transitive dependencies, use resource types (agents, snippets, commands)\n\
2975 as section headers, then specify 'tool: {}' within each dependency.",
2976 resource_type_str, resource_type_str
2977 )
2978 } else {
2979 format!(
2980 "\n\nValid resource types: agent, command, snippet, hook, mcp-server, script\n\
2981 Source file: {}",
2982 dep.get_path()
2983 )
2984 };
2985
2986 anyhow::anyhow!("{}{}", base_msg, hint)
2987 })?;
2988
2989 // Determine flatten behavior: use explicit setting or tool config default
2990 let flatten = dep
2991 .get_flatten()
2992 .or_else(|| {
2993 // Get flatten default from tool config
2994 self.manifest
2995 .get_tool_config(artifact_type)
2996 .and_then(|config| config.resources.get(resource_type.to_plural()))
2997 .and_then(|resource_config| resource_config.flatten)
2998 })
2999 .unwrap_or(false); // Default to false if not configured
3000
3001 let path = if let Some(custom_target) = dep.get_target() {
3002 // Custom target is relative to the artifact's resource directory
3003 let base_target = PathBuf::from(artifact_path.display().to_string())
3004 .join(custom_target.trim_start_matches('/'));
3005 // For custom targets, still strip prefix based on the original artifact path
3006 // (not the custom target), to avoid duplicate directories
3007 let relative_path = compute_relative_install_path(
3008 &artifact_path,
3009 Path::new(&filename),
3010 flatten,
3011 );
3012 base_target.join(relative_path)
3013 } else {
3014 // Use artifact configuration for default path
3015 let relative_path = compute_relative_install_path(
3016 &artifact_path,
3017 Path::new(&filename),
3018 flatten,
3019 );
3020 artifact_path.join(relative_path)
3021 };
3022 normalize_path_for_storage(normalize_path(&path))
3023 }
3024 };
3025
3026 Ok(LockedResource {
3027 name: unique_name,
3028 source: Some(source_name.to_string()),
3029 url: Some(source_url.clone()),
3030 path: normalize_path_for_storage(dep.get_path()),
3031 version: resolved_version, // Resolved version (tag/branch like "v2.1.4" or "main")
3032 resolved_commit: Some(resolved_commit),
3033 checksum: String::new(), // Will be calculated during installation
3034 installed_at,
3035 dependencies: self.get_dependencies_for(
3036 name,
3037 Some(source_name),
3038 resource_type,
3039 dep.get_tool(),
3040 ),
3041 resource_type,
3042 tool: Some(artifact_type_string),
3043 manifest_alias: self
3044 .pattern_alias_map
3045 .get(&(resource_type, name.to_string()))
3046 .cloned(), // Check if this came from a pattern expansion
3047 applied_patches: HashMap::new(), // Populated during installation, not resolution
3048 })
3049 }
3050 }
3051
3052 /// Gets the dependencies for a resource from the dependency map.
3053 ///
3054 /// Returns a list of dependencies in the format "`resource_type/name`".
3055 ///
3056 /// # Parameters
3057 /// - `name`: The resource name
3058     /// - `source`: The source name (None for local dependencies)
     /// - `resource_type`: The resource type, used to disambiguate same-named resources
     /// - `tool`: The tool associated with the resource, if any
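    ///
    /// # Examples
    ///
    /// A hypothetical lookup for a remote agent installed with the claude-code tool
    /// (names and the returned list are illustrative):
    ///
    /// ```ignore
    /// let deps = self.get_dependencies_for(
    ///     "reviewer",
    ///     Some("community"),
    ///     crate::core::ResourceType::Agent,
    ///     Some("claude-code"),
    /// );
    /// // e.g. ["snippet/utils.md"] if the agent declared that transitive dependency
    /// ```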
3059 fn get_dependencies_for(
3060 &self,
3061 name: &str,
3062 source: Option<&str>,
3063 resource_type: crate::core::ResourceType,
3064 tool: Option<&str>,
3065 ) -> Vec<String> {
3066 // Use the threaded resource_type parameter from the manifest
3067 // This prevents type misclassification when same names exist across types
3068 let key = (
3069 resource_type,
3070 name.to_string(),
3071 source.map(std::string::ToString::to_string),
3072 tool.map(std::string::ToString::to_string),
3073 );
3074 self.dependency_map.get(&key).cloned().unwrap_or_default()
3075 }
3076
3077 /// Resolves a pattern-based dependency to multiple locked resources.
3078 ///
3079 /// Pattern dependencies match multiple resources using glob patterns,
3080 /// enabling batch installation of related resources.
3081 ///
3082 /// # Process
3083 ///
3084 /// 1. Sync the source repository
3085     /// 1. Look up the worktree and commit prepared during phase 1 (or the local filesystem)
3086     /// 2. Search it for files matching the glob pattern
3087     /// 3. Derive a resource name from each matched file
3088     /// 4. Preserve relative directory structure for each matched file
3089     /// 5. Create a locked resource for each match
3090 /// # Parameters
3091 ///
3092 /// - `name`: The dependency name (used for the collection)
3093     /// - `dep`: The pattern-based dependency specification
     /// - `resource_type`: The resource type the pattern expands into (inherited by every match)
3094 ///
3095 /// # Returns
3096 ///
3097 /// A vector of [`LockedResource`] entries, one for each matched file.
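    ///
    /// # Examples
    ///
    /// A minimal sketch; the alias, source, and pattern are hypothetical:
    ///
    /// ```ignore
    /// // Manifest entry "ai-agents" whose path is "agents/ai/*.md" in source "community".
    /// let entries = self
    ///     .resolve_pattern_dependency("ai-agents", &dep, crate::core::ResourceType::Agent)
    ///     .await?;
    /// // One LockedResource per matched file, all tagged with the originating alias.
    /// assert!(entries.iter().all(|e| e.manifest_alias.as_deref() == Some("ai-agents")));
    /// ```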
3098 async fn resolve_pattern_dependency(
3099 &mut self,
3100 name: &str,
3101 dep: &ResourceDependency,
3102 resource_type: crate::core::ResourceType,
3103 ) -> Result<Vec<LockedResource>> {
3104 // Pattern dependencies use the path field with glob characters
3105 if !dep.is_pattern() {
3106 return Err(anyhow::anyhow!(
3107 "Expected pattern dependency but no glob characters found in path"
3108 ));
3109 }
3110
3111 let pattern = dep.get_path();
3112
3113 if dep.is_local() {
3114 // Local pattern dependency - search in filesystem
3115             // Extract the base path from the pattern when it contains path separators
3116 let (base_path, pattern_str) = if pattern.contains('/') || pattern.contains('\\') {
3117 // Pattern contains path separators, extract base path
3118 let pattern_path = Path::new(pattern);
3119 if let Some(parent) = pattern_path.parent() {
3120 if parent.is_absolute() || parent.starts_with("..") || parent.starts_with(".") {
3121 // Use the parent as base path and just the filename pattern
3122 (
3123 parent.to_path_buf(),
3124 pattern_path
3125 .file_name()
3126 .and_then(|s| s.to_str())
3127 .unwrap_or(pattern)
3128 .to_string(),
3129 )
3130 } else {
3131 // Relative path, use current directory as base
3132 (PathBuf::from("."), pattern.to_string())
3133 }
3134 } else {
3135 // No parent, use current directory
3136 (PathBuf::from("."), pattern.to_string())
3137 }
3138 } else {
3139 // Simple pattern without path separators
3140 (PathBuf::from("."), pattern.to_string())
3141 };
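            // Illustrative (hypothetical) splits produced by the branching above:
            //   "../shared/agents/*.md" -> base_path "../shared/agents", pattern "*.md"
            //   "*.md"                  -> base_path ".",                pattern "*.md"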
3142
3143 let pattern_resolver = crate::pattern::PatternResolver::new();
3144 let matches = pattern_resolver.resolve(&pattern_str, &base_path)?;
3145
3146 // Use the threaded resource_type from the parameter
3147 let mut resources = Vec::new();
3148
3149 for matched_path in matches {
3150 let resource_name = crate::pattern::extract_resource_name(&matched_path);
3151
3152 // Determine artifact type - use get_tool() method, then apply defaults
3153 let artifact_type_string = dep
3154 .get_tool()
3155 .map(|s| s.to_string())
3156 .unwrap_or_else(|| self.manifest.get_default_tool(resource_type));
3157 let artifact_type = artifact_type_string.as_str();
3158
3159 // Construct full relative path from base_path and matched_path
3160 let full_relative_path = if base_path == Path::new(".") {
3161 crate::utils::normalize_path_for_storage(
3162 matched_path.to_string_lossy().to_string(),
3163 )
3164 } else {
3165 crate::utils::normalize_path_for_storage(format!(
3166 "{}/{}",
3167 base_path.display(),
3168 matched_path.display()
3169 ))
3170 };
3171
3172 // Use the threaded resource_type (pattern dependencies inherit from parent)
3173
3174 // Hooks and MCP servers are configured in config files, not installed as artifact files
3175 let installed_at = match resource_type {
3176 crate::core::ResourceType::Hook | crate::core::ResourceType::McpServer => {
3177 // Use configured merge target, with fallback to hardcoded defaults
3178 if let Some(merge_target) =
3179 self.manifest.get_merge_target(artifact_type, resource_type)
3180 {
3181 normalize_path_for_storage(merge_target.display().to_string())
3182 } else {
3183 // Fallback to hardcoded defaults if not configured
3184 match resource_type {
3185 crate::core::ResourceType::Hook => {
3186 ".claude/settings.local.json".to_string()
3187 }
3188 crate::core::ResourceType::McpServer => {
3189 if artifact_type == "opencode" {
3190 ".opencode/opencode.json".to_string()
3191 } else {
3192 ".mcp.json".to_string()
3193 }
3194 }
3195 _ => unreachable!(),
3196 }
3197 }
3198 }
3199 _ => {
3200 // For regular resources, get the artifact path
3201 let artifact_path = self
3202 .manifest
3203 .get_artifact_resource_path(artifact_type, resource_type)
3204 .ok_or_else(|| {
3205 anyhow::anyhow!(
3206 "Resource type '{}' is not supported by tool '{}'",
3207 resource_type,
3208 artifact_type
3209 )
3210 })?;
3211
3212 // Determine flatten behavior: use explicit setting or tool config default
3213 let dep_flatten = dep.get_flatten();
3214 let tool_flatten = self
3215 .manifest
3216 .get_tool_config(artifact_type)
3217 .and_then(|config| config.resources.get(resource_type.to_plural()))
3218 .and_then(|resource_config| resource_config.flatten);
3219
3220 let flatten = dep_flatten.or(tool_flatten).unwrap_or(false); // Default to false if not configured
3221
3222 // Determine the base target directory
3223 let base_target = if let Some(custom_target) = dep.get_target() {
3224 // Custom target is relative to the artifact's resource directory
3225 PathBuf::from(artifact_path.display().to_string())
3226 .join(custom_target.trim_start_matches('/'))
3227 } else {
3228 artifact_path.to_path_buf()
3229 };
3230
3231 // Extract the meaningful path structure
3232 let filename = {
3233 // Construct full path from base_path and matched_path for extraction
3234 let full_path = if base_path == Path::new(".") {
3235 matched_path.to_path_buf()
3236 } else {
3237 base_path.join(&matched_path)
3238 };
3239 extract_meaningful_path(&full_path)
3240 };
3241
3242 // Use compute_relative_install_path to avoid redundant prefixes
3243 let relative_path = compute_relative_install_path(
3244 &base_target,
3245 Path::new(&filename),
3246 flatten,
3247 );
3248 normalize_path_for_storage(normalize_path(&base_target.join(relative_path)))
3249 }
3250 };
3251
3252 resources.push(LockedResource {
3253 name: resource_name.clone(),
3254 source: None,
3255 url: None,
3256 path: full_relative_path,
3257 version: None,
3258 resolved_commit: None,
3259 checksum: String::new(),
3260 installed_at,
3261 dependencies: self.get_dependencies_for(
3262 &resource_name,
3263 None,
3264 resource_type,
3265 dep.get_tool(),
3266 ),
3267 resource_type,
3268 tool: Some(
3269 dep.get_tool()
3270 .map(std::string::ToString::to_string)
3271 .unwrap_or_else(|| self.manifest.get_default_tool(resource_type)),
3272 ),
3273 manifest_alias: Some(name.to_string()), // Pattern dependency: preserve original alias
3274 applied_patches: HashMap::new(), // Populated during installation, not resolution
3275 });
3276 }
3277
3278 Ok(resources)
3279 } else {
3280 // Remote pattern dependency - need to sync and search
3281 let source_name = dep.get_source().ok_or_else(|| AgpmError::ConfigError {
3282 message: format!("Pattern dependency '{name}' has no source specified"),
3283 })?;
3284
3285 let source_url = self.source_manager.get_source_url(source_name).ok_or_else(|| {
3286 AgpmError::SourceNotFound {
3287 name: source_name.to_string(),
3288 }
3289 })?;
3290
3291 let version_key = dep
3292 .get_version()
3293 .map_or_else(|| "HEAD".to_string(), std::string::ToString::to_string);
3294 let prepared_key = Self::group_key(source_name, &version_key);
3295
3296 let prepared = self
3297 .prepared_versions
3298 .get(&prepared_key)
3299 .ok_or_else(|| {
3300 anyhow::anyhow!(
3301                         "Prepared state missing for source '{source_name}' @ '{version_key}'. Phase 1 preparation should have populated this entry."
3302 )
3303 })?;
3304
3305 let repo_path = prepared.worktree_path.clone();
3306 let resolved_version = prepared.resolved_version.clone();
3307 let resolved_commit = prepared.resolved_commit.clone();
3308
3309 // Search for matching files in the repository
3310 let pattern_resolver = crate::pattern::PatternResolver::new();
3311 let repo_path_ref = Path::new(&repo_path);
3312 let matches = pattern_resolver.resolve(pattern, repo_path_ref)?;
3313
3314 // Use the threaded resource_type parameter (no need to recompute)
3315 let mut resources = Vec::new();
3316
3317 for matched_path in matches {
3318 let resource_name = crate::pattern::extract_resource_name(&matched_path);
3319
3320 // Determine artifact type - use get_tool() method, then apply defaults
3321 let artifact_type_string = dep
3322 .get_tool()
3323 .map(|s| s.to_string())
3324 .unwrap_or_else(|| self.manifest.get_default_tool(resource_type));
3325 let artifact_type = artifact_type_string.as_str();
3326
3327 // Use the threaded resource_type (pattern dependencies inherit from parent)
3328
3329 // Hooks and MCP servers are configured in config files, not installed as artifact files
3330 let installed_at = match resource_type {
3331 crate::core::ResourceType::Hook | crate::core::ResourceType::McpServer => {
3332 // Use configured merge target, with fallback to hardcoded defaults
3333 if let Some(merge_target) =
3334 self.manifest.get_merge_target(artifact_type, resource_type)
3335 {
3336 normalize_path_for_storage(merge_target.display().to_string())
3337 } else {
3338 // Fallback to hardcoded defaults if not configured
3339 match resource_type {
3340 crate::core::ResourceType::Hook => {
3341 ".claude/settings.local.json".to_string()
3342 }
3343 crate::core::ResourceType::McpServer => {
3344 if artifact_type == "opencode" {
3345 ".opencode/opencode.json".to_string()
3346 } else {
3347 ".mcp.json".to_string()
3348 }
3349 }
3350 _ => unreachable!(),
3351 }
3352 }
3353 }
3354 _ => {
3355 // For regular resources, get the artifact path
3356 let artifact_path = self
3357 .manifest
3358 .get_artifact_resource_path(artifact_type, resource_type)
3359 .ok_or_else(|| {
3360 anyhow::anyhow!(
3361 "Resource type '{}' is not supported by tool '{}'",
3362 resource_type,
3363 artifact_type
3364 )
3365 })?;
3366
3367 // Determine flatten behavior: use explicit setting or tool config default
3368 let dep_flatten = dep.get_flatten();
3369 let tool_flatten = self
3370 .manifest
3371 .get_tool_config(artifact_type)
3372 .and_then(|config| config.resources.get(resource_type.to_plural()))
3373 .and_then(|resource_config| resource_config.flatten);
3374
3375 let flatten = dep_flatten.or(tool_flatten).unwrap_or(false); // Default to false if not configured
3376
3377 // Determine the base target directory
3378 let base_target = if let Some(custom_target) = dep.get_target() {
3379 // Custom target is relative to the artifact's resource directory
3380 PathBuf::from(artifact_path.display().to_string())
3381 .join(custom_target.trim_start_matches('/'))
3382 } else {
3383 artifact_path.to_path_buf()
3384 };
3385
3386 // Extract the meaningful path structure
3387 // 1. For relative paths with "../", strip parent components: "../../snippets/dir/file.md" → "snippets/dir/file.md"
3388 // 2. For absolute paths, resolve ".." first then strip root: "/tmp/foo/../bar/agent.md" → "tmp/bar/agent.md"
3389 // 3. For clean relative paths, use as-is: "agents/test.md" → "agents/test.md"
3390 let filename = {
3391 // Construct full path from repo_path and matched_path for extraction
3392 let full_path = repo_path_ref.join(&matched_path);
3393 let components: Vec<_> = full_path.components().collect();
3394
3395 if full_path.is_absolute() {
3396 // Case 2: Absolute path - resolve ".." components first, then strip root
3397 let mut resolved = Vec::new();
3398
3399 for component in components.iter() {
3400 match component {
3401 std::path::Component::Normal(name) => {
3402 resolved.push(name.to_str().unwrap_or(""));
3403 }
3404 std::path::Component::ParentDir => {
3405 // Pop the last component if there is one
3406 resolved.pop();
3407 }
3408 // Skip RootDir, Prefix, and CurDir
3409 _ => {}
3410 }
3411 }
3412
3413 resolved.join("/")
3414 } else if components
3415 .iter()
3416 .any(|c| matches!(c, std::path::Component::ParentDir))
3417 {
3418 // Case 1: Relative path with "../" - skip all parent components
3419 let start_idx = components
3420 .iter()
3421 .position(|c| matches!(c, std::path::Component::Normal(_)))
3422 .unwrap_or(0);
3423
3424 components[start_idx..]
3425 .iter()
3426 .filter_map(|c| c.as_os_str().to_str())
3427 .collect::<Vec<_>>()
3428 .join("/")
3429 } else {
3430 // Case 3: Clean relative path - use as-is
3431 full_path.to_str().unwrap_or("").replace('\\', "/") // Normalize to forward slashes
3432 }
3433 };
3434
3435 // Use compute_relative_install_path to avoid redundant prefixes
3436 let relative_path = compute_relative_install_path(
3437 &base_target,
3438 Path::new(&filename),
3439 flatten,
3440 );
3441 normalize_path_for_storage(normalize_path(&base_target.join(relative_path)))
3442 }
3443 };
3444
3445 resources.push(LockedResource {
3446 name: resource_name.clone(),
3447 source: Some(source_name.to_string()),
3448 url: Some(source_url.clone()),
3449 path: normalize_path_for_storage(matched_path.to_string_lossy().to_string()),
3450 version: resolved_version.clone(), // Use the resolved version (e.g., "main")
3451 resolved_commit: Some(resolved_commit.clone()),
3452 checksum: String::new(),
3453 installed_at,
3454 dependencies: self.get_dependencies_for(
3455 &resource_name,
3456 Some(source_name),
3457 resource_type,
3458 dep.get_tool(),
3459 ),
3460 resource_type,
3461 tool: Some(
3462 dep.get_tool()
3463 .map(|s| s.to_string())
3464 .unwrap_or_else(|| self.manifest.get_default_tool(resource_type)),
3465 ),
3466 manifest_alias: Some(name.to_string()), // Pattern dependency: preserve original alias
3467 applied_patches: HashMap::new(), // Populated during installation, not resolution
3468 });
3469 }
3470
3471 Ok(resources)
3472 }
3473 }
3474
3475     // Checks out a specific version in a Git repository.
3476     //
3477     // This method implements the version resolution strategy by attempting
3478     // to checkout Git references in order of preference:
3479     //
3480     // 1. **Tags**: Exact tag matches (e.g., `v1.2.3`)
3481     // 2. **Branches**: Branch heads (e.g., `main`, `develop`)
3482     // 3. **Commits**: Direct commit hashes (40-character SHA)
3483     //
3484     // # Algorithm
3485     //
3486     // ```text
3487     // 1. List all tags in repository
3488     // 2. If version matches a tag, checkout tag
3489     // 3. Else attempt branch checkout
3490     // 4. Else attempt commit hash checkout
3491     // 5. Return current HEAD commit hash
3492     // ```
3493     //
3494     // # Performance Note
3495     //
3496     // Tag listing is cached by Git, making tag lookups efficient.
3497     // The method avoids unnecessary network operations by checking
3498     // local references first.
3499     //
3500     // # Parameters
3501     //
3502     // - `repo`: Git repository handle
3503     // - `version`: Version constraint (tag, branch, or commit hash)
3504     //
3505     // # Returns
3506     //
3507     // The commit hash (SHA) of the checked out version.
3508     //
3509     // # Errors
3510     //
3511     // Returns an error if:
3512     // - Git repository is in an invalid state
3513     // - Version string doesn't match any tag, branch, or valid commit
3514     // - Git checkout fails due to conflicts or permissions
3515     // - Repository is corrupted or inaccessible
3516     //
3517     // Determines the resource type (agent or snippet) from a dependency name.
3518     //
3519     // This method checks which manifest section contains the dependency
3520     // to determine where it should be installed in the project.
3521     //
3522     // # Resource Type Mapping
3523     //
3524     // - **agents**: Dependencies listed in `[agents]` section
3525     // - **snippets**: Dependencies listed in `[snippets]` section
3526     //
3527     // # Installation Paths
3528     //
3529     // Resource types determine installation directories based on tool configuration:
3530     // - Agents typically install to `.claude/agents/{name}.md` (claude-code tool)
3531     // - Snippets typically install to `.agpm/snippets/{name}.md` (agpm tool)
3532     //
3533     // # Parameters
3534     //
3535     // - `name`: Dependency name as defined in manifest
3536     //
3537     // # Returns
3538     //
3539     // Resource type as a string: `"agent"` or `"snippet"`.
3540     //
3541     // # Default Behavior
3542     //
3543     // If a dependency is not found in the agents section, it defaults
3544     // to `"snippet"`. This handles edge cases and maintains backward compatibility.
3545 /// Resolve version conflicts between two dependencies.
3546 ///
3547 /// This method implements version conflict resolution strategies when the same
3548 /// resource is required with different versions by different dependencies.
3549 ///
3550 /// # Resolution Strategy
3551 ///
3552 /// The current implementation uses a "highest compatible version" strategy:
3553 /// 1. If one dependency has no version (latest), use the other's version
3554     /// 1. If one dependency has no version (HEAD), use the other's version
3555     /// 2. If either side is a semver range (`^`, `~`, `>`, `<`), delegate to
     ///    `resolve_semver_range_conflict` to find a concrete version that satisfies both
3556     /// 3. If both are exact semver versions, keep the higher one (with a warning)
     /// 4. If only one parses as semver, prefer it over the branch/commit ref
     /// 5. If neither parses as semver, pick the alphabetically first ref so resolution
     ///    stays deterministic
3557 /// # Future Enhancements
3558 ///
3559 /// - Support for version ranges (^1.0.0, ~2.1.0)
3561 /// - Interactive conflict resolution
3562 ///
3563 /// # Parameters
3564 ///
3565 /// - `resource_name`: Name of the conflicting resource
3566 /// - `existing`: Current version in the dependency map
3567 /// - `new_dep`: New version being requested
3568 /// - `requester`: Name of the dependency requesting the new version
3569 ///
3570 /// # Returns
3571 ///
3572 /// The resolved dependency that satisfies both requirements if possible,
3573 /// or the higher version with a warning if not compatible.
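    ///
    /// # Examples
    ///
    /// A minimal sketch of the comparison at the heart of the exact-version case, using
    /// the `semver` crate directly (the version strings are hypothetical):
    ///
    /// ```ignore
    /// use semver::Version;
    ///
    /// let existing = Version::parse("1.2.0").unwrap();
    /// let requested = Version::parse("1.10.0").unwrap();
    /// // Semantic ordering, not lexical: 1.10.0 is newer than 1.2.0.
    /// assert!(requested > existing);
    /// ```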
3574 async fn resolve_version_conflict(
3575 &self,
3576 resource_name: &str,
3577 existing: &ResourceDependency,
3578 new_dep: &ResourceDependency,
3579 requester: &str,
3580 ) -> Result<ResourceDependency> {
3581 let existing_version = existing.get_version();
3582 let new_version = new_dep.get_version();
3583
3584 // If versions are identical, no conflict
3585 if existing_version == new_version {
3586 return Ok(existing.clone());
3587 }
3588
3589 // Check if either version is a semver range (not an exact version)
3590 let is_existing_range = existing_version.is_some_and(|v| {
3591 v.starts_with('^') || v.starts_with('~') || v.starts_with('>') || v.starts_with('<')
3592 });
3593 let is_new_range = new_version.is_some_and(|v| {
3594 v.starts_with('^') || v.starts_with('~') || v.starts_with('>') || v.starts_with('<')
3595 });
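        // e.g. "^1.2.0", "~2.1", or ">=1.5.0, <2.0.0" count as ranges here; exact tags
        // ("v1.2.3"), branch names ("main"), and commit SHAs do not.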
3596
3597 // If we have semver ranges, resolve them to a concrete version
3598 if is_existing_range || is_new_range {
3599 return self
3600 .resolve_semver_range_conflict(resource_name, existing, new_dep, requester)
3601 .await;
3602 }
3603
3604 // Log the conflict for user awareness
3605 tracing::warn!(
3606 "Version conflict for '{}': existing version {:?} vs {:?} required by '{}'",
3607 resource_name,
3608 existing_version.unwrap_or("HEAD"),
3609 new_version.unwrap_or("HEAD"),
3610 requester
3611 );
3612
3613 // Resolution strategy
3614 match (existing_version, new_version) {
3615 (None, Some(_)) => {
3616 // Existing wants HEAD, new wants specific - use specific
3617 Ok(new_dep.clone())
3618 }
3619 (Some(_), None) => {
3620 // Existing wants specific, new wants HEAD - keep specific
3621 Ok(existing.clone())
3622 }
3623 (Some(v1), Some(v2)) => {
3624 // Both have versions - use semver-aware comparison
3625 use semver::Version;
3626
3627 // Try to parse as semver (strip 'v' prefix if present)
3628 let v1_semver = Version::parse(v1.trim_start_matches('v')).ok();
3629 let v2_semver = Version::parse(v2.trim_start_matches('v')).ok();
3630
3631 match (v1_semver, v2_semver) {
3632 (Some(sv1), Some(sv2)) => {
3633 // Both are valid semver - use proper semver comparison
3634 if sv1 >= sv2 {
3635 tracing::info!(
3636 "Resolving conflict: using version {} for {} (semver: {} >= {})",
3637 v1,
3638 resource_name,
3639 sv1,
3640 sv2
3641 );
3642 Ok(existing.clone())
3643 } else {
3644 tracing::info!(
3645 "Resolving conflict: using version {} for {} (semver: {} < {})",
3646 v2,
3647 resource_name,
3648 sv1,
3649 sv2
3650 );
3651 Ok(new_dep.clone())
3652 }
3653 }
3654 (Some(_), None) => {
3655 // v1 is semver, v2 is not (branch/commit) - prefer semver
3656 tracing::info!(
3657 "Resolving conflict: preferring semver version {} over git ref {} for {}",
3658 v1,
3659 v2,
3660 resource_name
3661 );
3662 Ok(existing.clone())
3663 }
3664 (None, Some(_)) => {
3665 // v1 is not semver (branch/commit), v2 is - prefer semver
3666 tracing::info!(
3667 "Resolving conflict: preferring semver version {} over git ref {} for {}",
3668 v2,
3669 v1,
3670 resource_name
3671 );
3672 Ok(new_dep.clone())
3673 }
3674 (None, None) => {
3675 // Neither is semver (both branches/commits)
3676 // Use deterministic ordering: alphabetical
3677 if v1 <= v2 {
3678 tracing::info!(
3679 "Resolving conflict: using git ref {} for {} (alphabetically first)",
3680 v1,
3681 resource_name
3682 );
3683 Ok(existing.clone())
3684 } else {
3685 tracing::info!(
3686 "Resolving conflict: using git ref {} for {} (alphabetically first)",
3687 v2,
3688 resource_name
3689 );
3690 Ok(new_dep.clone())
3691 }
3692 }
3693 }
3694 }
3695 (None, None) => {
3696 // Both want HEAD - no conflict
3697 Ok(existing.clone())
3698 }
3699 }
3700 }
3701
3702 /// Resolve a version conflict where at least one version is a semver range.
3703 ///
3704 /// This method handles version resolution when one or both dependencies specify
3705 /// semver ranges (like `^1.0.0`, `>=1.5.0`) instead of exact versions. It:
3706 /// 1. Fetches available versions from the Git repository
3707 /// 2. Uses the `ConflictDetector` to check if ranges are compatible
3708 /// 3. Finds the best version that satisfies both constraints
3709 /// 4. Returns a resolved dependency with the concrete version
3710 ///
3711 /// # Arguments
3712 ///
3713 /// * `resource_name` - Name of the conflicting resource
3714 /// * `existing` - Current dependency with version range
3715 /// * `new_dep` - New dependency with version range
3716 /// * `requester` - Name of the dependency requesting the new version
3717 ///
3718 /// # Returns
3719 ///
3720 /// A `ResourceDependency` with the resolved concrete version that satisfies both ranges.
3721 ///
3722 /// # Errors
3723 ///
3724 /// Returns an error if:
3725 /// - Dependencies are local (no source) but have version ranges
3726 /// - Source hasn't been synced yet
3727 /// - Version ranges are incompatible (no common version)
3728 /// - No available versions satisfy the constraints
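    ///
    /// # Examples
    ///
    /// A minimal sketch of the constraint check this resolution relies on, using the
    /// `semver` crate directly (the ranges and candidate tag are hypothetical):
    ///
    /// ```ignore
    /// use semver::{Version, VersionReq};
    ///
    /// let existing = VersionReq::parse("^1.0.0").unwrap();
    /// let requested = VersionReq::parse(">=1.2.0, <2.0.0").unwrap();
    /// let candidate = Version::parse("1.4.2").unwrap(); // e.g. tag "v1.4.2"
    /// assert!(existing.matches(&candidate) && requested.matches(&candidate));
    /// ```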
3729 async fn resolve_semver_range_conflict(
3730 &self,
3731 resource_name: &str,
3732 existing: &ResourceDependency,
3733 new_dep: &ResourceDependency,
3734 requester: &str,
3735 ) -> Result<ResourceDependency> {
3736 use crate::manifest::DetailedDependency;
3737 use crate::resolver::version_resolution::parse_tags_to_versions;
3738 use semver::Version;
3739 use std::collections::HashMap;
3740
3741 let existing_version = existing.get_version().unwrap_or("HEAD");
3742 let new_version = new_dep.get_version().unwrap_or("HEAD");
3743
3744 tracing::info!(
3745 "Resolving semver range conflict for '{}': existing '{}' vs required '{}' by '{}'",
3746 resource_name,
3747 existing_version,
3748 new_version,
3749 requester
3750 );
3751
3752 // Get source (both should have same source for transitive deps)
3753 let source = existing.get_source().or_else(|| new_dep.get_source()).ok_or_else(|| {
3754 AgpmError::Other {
3755 message: format!(
3756 "Cannot resolve semver ranges for local dependencies: {}",
3757 resource_name
3758 ),
3759 }
3760 })?;
3761
3762 // Get bare repo path
3763 let repo_path =
3764 self.version_resolver.get_bare_repo_path(source).ok_or_else(|| AgpmError::Other {
3765 message: format!("Source '{}' not synced yet", source),
3766 })?;
3767
3768 // List available tags from repository
3769 let tags = self.get_available_versions(repo_path).await?;
3770 tracing::debug!("Found {} tags for source '{}'", tags.len(), source);
3771
3772 // Parse tags to semver versions
3773 let parsed_versions = parse_tags_to_versions(tags);
3774 let available_versions: Vec<Version> =
3775 parsed_versions.iter().map(|(_, v)| v.clone()).collect();
3776
3777 if available_versions.is_empty() {
3778 return Err(AgpmError::Other {
3779 message: format!(
3780 "No valid semver tags found for source '{}' to resolve range conflict",
3781 source
3782 ),
3783 }
3784 .into());
3785 }
3786
3787 // Use ConflictDetector to find best version
3788 let mut detector = ConflictDetector::new();
3789 let resource_id = format!("{}:{}", source, existing.get_path());
3790 detector.add_requirement(&resource_id, "existing", existing_version);
3791 detector.add_requirement(&resource_id, requester, new_version);
3792
3793 // Check for conflicts
3794 let conflicts = detector.detect_conflicts();
3795 if !conflicts.is_empty() {
3796 return Err(AgpmError::Other {
3797 message: format!(
3798 "Incompatible version ranges for '{}': existing '{}' vs required '{}' by '{}'",
3799 resource_name, existing_version, new_version, requester
3800 ),
3801 }
3802 .into());
3803 }
3804
3805 // Find best version that satisfies both ranges
3806 let mut versions_map = HashMap::new();
3807 versions_map.insert(resource_id.clone(), available_versions);
3808
3809 let resolved = detector.resolve_conflicts(&versions_map)?;
3810 let best_version = resolved.get(&resource_id).ok_or_else(|| AgpmError::Other {
3811 message: format!("Failed to resolve version for '{}'", resource_name),
3812 })?;
3813
3814 // Find the tag for this version
3815 let best_tag = parsed_versions
3816 .iter()
3817 .find(|(_, v)| v == best_version)
3818 .map(|(tag, _)| tag.clone())
3819 .ok_or_else(|| AgpmError::Other {
3820 message: format!("Version {} not found in tags", best_version),
3821 })?;
3822
3823 tracing::info!(
3824 "Resolved '{}' to version {} (satisfies both '{}' and '{}')",
3825 resource_name,
3826 best_tag,
3827 existing_version,
3828 new_version
3829 );
3830
3831 // Create new dependency with resolved version
3832 // We need to create a Detailed variant with the concrete version
3833 match new_dep {
3834 ResourceDependency::Detailed(d) => {
3835 let mut resolved_dep = (**d).clone();
3836 resolved_dep.version = Some(best_tag);
3837 resolved_dep.branch = None; // Clear branch/rev since we have concrete version
3838 resolved_dep.rev = None;
3839 Ok(ResourceDependency::Detailed(Box::new(resolved_dep)))
3840 }
3841 ResourceDependency::Simple(_) => {
3842 // Create a DetailedDependency from the simple path
3843 // This shouldn't happen since Simple deps don't have versions, but handle it
3844 Ok(ResourceDependency::Detailed(Box::new(DetailedDependency {
3845 source: Some(source.to_string()),
3846 path: existing.get_path().to_string(),
3847 version: Some(best_tag),
3848 branch: None,
3849 rev: None,
3850 command: None,
3851 args: None,
3852 target: None,
3853 filename: None,
3854 dependencies: None,
3855 tool: Some("claude-code".to_string()), // Default tool
3856 flatten: None,
3857 })))
3858 }
3859 }
3860 }
3861
3862 /// Updates an existing lockfile with new or changed dependencies.
3863 ///
3864 /// This method performs incremental dependency resolution by comparing
3865 /// the current manifest against an existing lockfile and updating only
3866 /// the specified dependencies (or all if none specified).
3867 ///
3868 /// # Update Strategy
3869 ///
3870 /// The update process follows these steps:
3871 /// 1. **Selective Resolution**: Only resolve specified dependencies
3872 /// 2. **Preserve Existing**: Keep unchanged dependencies from existing lockfile
3873 /// 3. **In-place Updates**: Replace matching entries with new versions
3874 /// 4. **New Additions**: Append newly added dependencies
3875 ///
3876 /// # Use Cases
3877 ///
3878 /// - **Selective Updates**: Update specific outdated dependencies
3879 /// - **Security Patches**: Update dependencies with known vulnerabilities
3880 /// - **Feature Updates**: Pull latest versions for active development
3881 /// - **Manifest Changes**: Reflect additions/modifications to agpm.toml
3882 ///
3883 /// # Parameters
3884 ///
3885 /// - `existing`: Current lockfile to update
3886 /// - `deps_to_update`: Optional list of specific dependencies to update.
3887 /// If `None`, all dependencies are updated.
3889 ///
3890 /// # Returns
3891 ///
3892 /// A new [`LockFile`] with updated dependencies. The original lockfile
3893 /// structure is preserved, with only specified entries modified.
3894 ///
3895 /// # Algorithm Complexity
3896 ///
3897 /// - **Time**: O(u + s·log(t)) where u = dependencies to update
3898 /// - **Space**: O(n) where n = total dependencies in lockfile
3899 ///
3900 /// # Performance Benefits
3901 ///
3902 /// - **Network Optimization**: Only syncs sources for updated dependencies
3903 /// - **Cache Utilization**: Reuses existing source repositories
3904 /// - **Parallel Processing**: Updates multiple dependencies concurrently
3905 ///
3906 /// # Errors
3907 ///
3908 /// Update can fail due to:
3909 /// - Network issues accessing source repositories
3910 /// - Version constraints that cannot be satisfied
3911 /// - Authentication failures for private sources
3912 /// - Corrupted or inaccessible cache directories
3913 ///
3914 /// [`LockFile`]: crate::lockfile::LockFile
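    ///
    /// # Examples
    ///
    /// A hypothetical call site (the resolver, existing lockfile, and dependency name
    /// are illustrative):
    ///
    /// ```ignore
    /// // Update a single dependency, keeping everything else locked.
    /// let updated = resolver.update(&existing, Some(vec!["my-agent".to_string()])).await?;
    ///
    /// // Or refresh every dependency in the manifest.
    /// let refreshed = resolver.update(&existing, None).await?;
    /// ```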
3915 pub async fn update(
3916 &mut self,
3917 existing: &LockFile,
3918 deps_to_update: Option<Vec<String>>,
3919 ) -> Result<LockFile> {
3920 let mut lockfile = existing.clone();
3921
3922 // Update sources from manifest (handles source URL changes and new sources)
3923 lockfile.sources.clear();
3924 for (name, url) in &self.manifest.sources {
3925 lockfile.add_source(name.clone(), url.clone(), String::new());
3926 }
3927
3928 // Determine which dependencies to update
3929 let deps_to_check: HashSet<String> = if let Some(specific) = deps_to_update {
3930 specific.into_iter().collect()
3931 } else {
3932 // Update all dependencies
3933 self.manifest.all_dependencies().iter().map(|(name, _)| (*name).to_string()).collect()
3934 };
3935
3936 // Get all base dependencies with their types to avoid mis-typing same-named resources
3937 let base_deps: Vec<(String, ResourceDependency, crate::core::ResourceType)> = self
3938 .manifest
3939 .all_dependencies_with_types()
3940 .into_iter()
3941 .map(|(name, dep, resource_type)| (name.to_string(), dep.into_owned(), resource_type))
3942 .collect();
3943
3944 // Note: We assume the update command has already called pre_sync_sources
3945 // during the "Syncing sources" phase, so repositories are already available.
3946 // We just need to prepare and resolve versions now.
3947
3948 // Prepare remote groups to resolve versions (reuses pre-synced repos)
3949 // prepare_remote_groups only needs name and dep, not type
3950 let base_deps_for_prep: Vec<(String, ResourceDependency)> =
3951 base_deps.iter().map(|(name, dep, _)| (name.clone(), dep.clone())).collect();
3952 self.prepare_remote_groups(&base_deps_for_prep).await?;
3953
3954 // First, remove stale entries that are no longer in the manifest
3955 // This prevents conflicts from commented-out or removed dependencies
3956 self.remove_stale_manifest_entries(&mut lockfile);
3957
3958 // Remove old entries for manifest dependencies being updated
3959 // This handles source changes (e.g., Git -> local path) and type changes
3960 self.remove_manifest_entries_for_update(&mut lockfile, &deps_to_check);
3961
3962 // Resolve transitive dependencies (always enabled for update to maintain consistency)
3963 let deps = self.resolve_transitive_dependencies(&base_deps, true).await?;
3964
3965 for (name, dep, resource_type) in deps {
3966 // Determine if this dependency should be skipped:
3967 // Skip ONLY if it's a manifest dependency that's NOT being updated
3968 // Always process:
3969 // - Manifest deps being updated (name in deps_to_check)
3970 // - Pattern expansions whose alias is being updated
3971 // - Transitive deps (not in manifest at all)
3972
3973 let is_manifest_dep = self.manifest.all_dependencies().iter().any(|(n, _)| *n == name);
3974 let pattern_alias = self.pattern_alias_map.get(&(resource_type, name.clone()));
3975
3976 let should_skip = is_manifest_dep
3977 && !deps_to_check.contains(&name)
3978 && !pattern_alias.is_some_and(|alias| deps_to_check.contains(alias));
3979
3980 if should_skip {
3981 // This is a manifest dependency not being updated - skip to avoid unnecessary work
3982 continue;
3983 }
3984
3985 // Check if this is a pattern dependency
3986 if dep.is_pattern() {
3987 // Pattern dependencies resolve to multiple resources
3988 let entries = self.resolve_pattern_dependency(&name, &dep, resource_type).await?;
3989
3990 // Add each resolved entry to the appropriate resource type with deduplication
3991 // Use resource_type from transitive resolution to avoid recomputing
3992 for entry in entries {
3993 match resource_type {
3994 crate::core::ResourceType::Agent => {
3995 // Match by (name, source) to allow same-named resources from different sources
3996 if let Some(existing) = lockfile
3997 .agents
3998 .iter_mut()
3999 .find(|e| e.name == entry.name && e.source == entry.source)
4000 {
4001 *existing = entry;
4002 } else {
4003 lockfile.agents.push(entry);
4004 }
4005 }
4006 crate::core::ResourceType::Snippet => {
4007 if let Some(existing) = lockfile
4008 .snippets
4009 .iter_mut()
4010 .find(|e| e.name == entry.name && e.source == entry.source)
4011 {
4012 *existing = entry;
4013 } else {
4014 lockfile.snippets.push(entry);
4015 }
4016 }
4017 crate::core::ResourceType::Command => {
4018 if let Some(existing) = lockfile
4019 .commands
4020 .iter_mut()
4021 .find(|e| e.name == entry.name && e.source == entry.source)
4022 {
4023 *existing = entry;
4024 } else {
4025 lockfile.commands.push(entry);
4026 }
4027 }
4028 crate::core::ResourceType::Script => {
4029 if let Some(existing) = lockfile
4030 .scripts
4031 .iter_mut()
4032 .find(|e| e.name == entry.name && e.source == entry.source)
4033 {
4034 *existing = entry;
4035 } else {
4036 lockfile.scripts.push(entry);
4037 }
4038 }
4039 crate::core::ResourceType::Hook => {
4040 if let Some(existing) = lockfile
4041 .hooks
4042 .iter_mut()
4043 .find(|e| e.name == entry.name && e.source == entry.source)
4044 {
4045 *existing = entry;
4046 } else {
4047 lockfile.hooks.push(entry);
4048 }
4049 }
4050 crate::core::ResourceType::McpServer => {
4051 if let Some(existing) = lockfile
4052 .mcp_servers
4053 .iter_mut()
4054 .find(|e| e.name == entry.name && e.source == entry.source)
4055 {
4056 *existing = entry;
4057 } else {
4058 lockfile.mcp_servers.push(entry);
4059 }
4060 }
4061 }
4062 }
4063 } else {
4064 // Regular single dependency
4065 // Use resource_type from transitive resolution to avoid recomputing
4066 let entry = self.resolve_dependency(&name, &dep, resource_type).await?;
4067
4068 // Use the helper method to add or update the entry
4069 self.add_or_update_lockfile_entry(&mut lockfile, &name, entry);
4070 }
4071 }
4072
4073 // Progress bar completion is handled by the caller
4074
4075 // Post-process dependencies to add version information
4076 self.add_version_to_dependencies(&mut lockfile)?;
4077
4078 // Detect target-path conflicts before finalizing
4079 self.detect_target_conflicts(&lockfile)?;
4080
4081 Ok(lockfile)
4082 }
4083
4084 /// Adds a dependency to the conflict detector.
4085 ///
4086 /// Builds a resource identifier from the dependency's source and path,
4087 /// and records it along with the requirer and version constraint.
4088 ///
4089 /// # Parameters
4090 ///
4091 /// - `_name`: The dependency name (unused, kept for consistency)
4092 /// - `dep`: The dependency specification
4093 /// - `required_by`: Identifier of the resource requiring this dependency
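    ///
    /// # Examples
    ///
    /// For a hypothetical remote dependency `agents/helper.md` from source `community`,
    /// pinned to `^1.0.0` and required by a parent identified as `"agent/reviewer"`, the
    /// recorded requirement is equivalent to:
    ///
    /// ```ignore
    /// detector.add_requirement("community:agents/helper.md", "agent/reviewer", "^1.0.0");
    /// ```
    ///
    /// Local dependencies are skipped, since they cannot introduce version conflicts.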
4094 fn add_to_conflict_detector(
4095 &mut self,
4096 _name: &str,
4097 dep: &ResourceDependency,
4098 required_by: &str,
4099 ) {
4100 // Skip local dependencies (no version conflicts possible)
4101 if dep.is_local() {
4102 return;
4103 }
4104
4105 // Build resource identifier: source:path
4106 let source = dep.get_source().unwrap_or("unknown");
4107 let path = dep.get_path();
4108 let resource_id = format!("{source}:{path}");
4109
4110 // Get version constraint (None means HEAD/unspecified)
4111 if let Some(version) = dep.get_version() {
4112 // Add to conflict detector
4113 self.conflict_detector.add_requirement(&resource_id, required_by, version);
4114 } else {
4115 // No version specified - use HEAD marker
4116 self.conflict_detector.add_requirement(&resource_id, required_by, "HEAD");
4117 }
4118 }
4119
4120     /// Post-processes lockfile entries to normalize their dependency references.
4121     ///
4122     /// Rewrites each entry's `dependencies` field from raw path references such as
4123     /// `"agent/rust-haiku.md"` into lockfile entry names such as `"agent/rust-haiku"`.
4124     /// When the referenced resource comes from a different source than its parent, the
     /// reference is expanded to `"source:resource_type/name:version"` to keep it unambiguous.
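    ///
    /// # Examples
    ///
    /// Illustrative rewrites (names, sources, and versions are hypothetical):
    ///
    /// ```text
    /// "agent/rust-haiku.md" -> "agent/rust-haiku"              (same source as the parent)
    /// "snippet/utils"       -> "snippet/utils"                 (looked up with ".md" appended)
    /// "agent/helper.md"     -> "community:agent/helper:v1.2.0" (resource found in another source)
    /// ```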
4125 fn add_version_to_dependencies(&self, lockfile: &mut LockFile) -> Result<()> {
4126 // Build a lookup map: (resource_type, path, source) -> unique_name
4127 // This allows us to resolve dependency paths to lockfile names
4128 // We store both the full path and just the filename for flexible lookup
4129 let mut lookup_map: HashMap<(crate::core::ResourceType, String, Option<String>), String> =
4130 HashMap::new();
4131
4132 // Helper to normalize path (strip leading ./, etc.)
4133 let normalize_path = |path: &str| -> String { path.trim_start_matches("./").to_string() };
4134
4135 // Helper to extract filename from path
4136 let extract_filename = |path: &str| -> Option<String> {
4137 path.split('/').next_back().map(std::string::ToString::to_string)
4138 };
4139
4140 // Build lookup map from all lockfile entries
4141 for entry in &lockfile.agents {
4142 let normalized_path = normalize_path(&entry.path);
4143 // Store by full path
4144 lookup_map.insert(
4145 (crate::core::ResourceType::Agent, normalized_path.clone(), entry.source.clone()),
4146 entry.name.clone(),
4147 );
4148 // Also store by filename for backward compatibility
4149 if let Some(filename) = extract_filename(&entry.path) {
4150 lookup_map.insert(
4151 (crate::core::ResourceType::Agent, filename, entry.source.clone()),
4152 entry.name.clone(),
4153 );
4154 }
4155 }
4156 for entry in &lockfile.snippets {
4157 let normalized_path = normalize_path(&entry.path);
4158 lookup_map.insert(
4159 (crate::core::ResourceType::Snippet, normalized_path.clone(), entry.source.clone()),
4160 entry.name.clone(),
4161 );
4162 if let Some(filename) = extract_filename(&entry.path) {
4163 lookup_map.insert(
4164 (crate::core::ResourceType::Snippet, filename, entry.source.clone()),
4165 entry.name.clone(),
4166 );
4167 }
4168 }
4169 for entry in &lockfile.commands {
4170 let normalized_path = normalize_path(&entry.path);
4171 lookup_map.insert(
4172 (crate::core::ResourceType::Command, normalized_path.clone(), entry.source.clone()),
4173 entry.name.clone(),
4174 );
4175 if let Some(filename) = extract_filename(&entry.path) {
4176 lookup_map.insert(
4177 (crate::core::ResourceType::Command, filename, entry.source.clone()),
4178 entry.name.clone(),
4179 );
4180 }
4181 }
4182 for entry in &lockfile.scripts {
4183 let normalized_path = normalize_path(&entry.path);
4184 lookup_map.insert(
4185 (crate::core::ResourceType::Script, normalized_path.clone(), entry.source.clone()),
4186 entry.name.clone(),
4187 );
4188 if let Some(filename) = extract_filename(&entry.path) {
4189 lookup_map.insert(
4190 (crate::core::ResourceType::Script, filename, entry.source.clone()),
4191 entry.name.clone(),
4192 );
4193 }
4194 }
4195 for entry in &lockfile.hooks {
4196 let normalized_path = normalize_path(&entry.path);
4197 lookup_map.insert(
4198 (crate::core::ResourceType::Hook, normalized_path.clone(), entry.source.clone()),
4199 entry.name.clone(),
4200 );
4201 if let Some(filename) = extract_filename(&entry.path) {
4202 lookup_map.insert(
4203 (crate::core::ResourceType::Hook, filename, entry.source.clone()),
4204 entry.name.clone(),
4205 );
4206 }
4207 }
4208 for entry in &lockfile.mcp_servers {
4209 let normalized_path = normalize_path(&entry.path);
4210 lookup_map.insert(
4211 (
4212 crate::core::ResourceType::McpServer,
4213 normalized_path.clone(),
4214 entry.source.clone(),
4215 ),
4216 entry.name.clone(),
4217 );
4218 if let Some(filename) = extract_filename(&entry.path) {
4219 lookup_map.insert(
4220 (crate::core::ResourceType::McpServer, filename, entry.source.clone()),
4221 entry.name.clone(),
4222 );
4223 }
4224 }
4225
4226 // Build a complete map of (resource_type, name, source) -> (source, version) for cross-source lookup
4227 // This needs to be done before we start mutating entries
4228 let mut resource_info_map: HashMap<ResourceKey, ResourceInfo> = HashMap::new();
4229
4230 for entry in &lockfile.agents {
4231 resource_info_map.insert(
4232 (crate::core::ResourceType::Agent, entry.name.clone(), entry.source.clone()),
4233 (entry.source.clone(), entry.version.clone()),
4234 );
4235 }
4236 for entry in &lockfile.snippets {
4237 resource_info_map.insert(
4238 (crate::core::ResourceType::Snippet, entry.name.clone(), entry.source.clone()),
4239 (entry.source.clone(), entry.version.clone()),
4240 );
4241 }
4242 for entry in &lockfile.commands {
4243 resource_info_map.insert(
4244 (crate::core::ResourceType::Command, entry.name.clone(), entry.source.clone()),
4245 (entry.source.clone(), entry.version.clone()),
4246 );
4247 }
4248 for entry in &lockfile.scripts {
4249 resource_info_map.insert(
4250 (crate::core::ResourceType::Script, entry.name.clone(), entry.source.clone()),
4251 (entry.source.clone(), entry.version.clone()),
4252 );
4253 }
4254 for entry in &lockfile.hooks {
4255 resource_info_map.insert(
4256 (crate::core::ResourceType::Hook, entry.name.clone(), entry.source.clone()),
4257 (entry.source.clone(), entry.version.clone()),
4258 );
4259 }
4260 for entry in &lockfile.mcp_servers {
4261 resource_info_map.insert(
4262 (crate::core::ResourceType::McpServer, entry.name.clone(), entry.source.clone()),
4263 (entry.source.clone(), entry.version.clone()),
4264 );
4265 }
4266
4267 // Helper function to update dependencies in a vector of entries
4268 let update_deps = |entries: &mut Vec<LockedResource>| {
4269 for entry in entries {
4270 let parent_source = entry.source.clone();
4271
4272 let updated_deps: Vec<String> =
4273 entry
4274 .dependencies
4275 .iter()
4276 .map(|dep| {
4277 // Parse "resource_type/path" format (e.g., "agent/rust-haiku.md" or "snippet/utils.md")
4278 if let Some((_resource_type_str, dep_path)) = dep.split_once('/') {
4279 // Parse resource type from string form (accepts both singular and plural)
4280 if let Ok(resource_type) =
4281 _resource_type_str.parse::<crate::core::ResourceType>()
4282 {
4283 // Normalize the path for lookup
4284 let dep_filename = normalize_path(dep_path);
4285
4286 // Look up the resource in the lookup map (same source as parent)
4287 if let Some(dep_name) = lookup_map.get(&(
4288 resource_type,
4289 dep_filename.clone(),
4290 parent_source.clone(),
4291 )) {
4292 // Found resource in same source - use singular form from enum
4293 return format!("{resource_type}/{dep_name}");
4294 }
4295
4296 // If not found with same source, try adding .md extension
4297 let dep_filename_with_ext = format!("{dep_filename}.md");
4298 if let Some(dep_name) = lookup_map.get(&(
4299 resource_type,
4300 dep_filename_with_ext.clone(),
4301 parent_source.clone(),
4302 )) {
4303 return format!("{resource_type}/{dep_name}");
4304 }
4305
4306 // Try looking for resource from ANY source (cross-source dependency)
4307 // Format: source:type/name:version
4308 for ((rt, filename, src), name) in &lookup_map {
4309 if *rt == resource_type
4310 && (filename == &dep_filename
4311 || filename == &dep_filename_with_ext)
4312 {
4313 // Found in different source - need to include source and version
4314 // Use the pre-built resource info map
4315 if let Some((source, version)) = resource_info_map
4316 .get(&(resource_type, name.clone(), src.clone()))
4317 {
4318 // Build full reference: source:type/name:version
4319 let mut dep_ref = String::new();
4320 if let Some(src) = source {
4321 dep_ref.push_str(src);
4322 dep_ref.push(':');
4323 }
4324 dep_ref.push_str(&resource_type.to_string());
4325 dep_ref.push('/');
4326 dep_ref.push_str(name);
4327 if let Some(ver) = version {
4328 dep_ref.push(':');
4329 dep_ref.push_str(ver);
4330 }
4331 return dep_ref;
4332 }
4333 }
4334 }
4335 }
4336 }
4337 // If parsing fails or resource not found, return as-is
4338 dep.clone()
4339 })
4340 .collect();
4341
4342 entry.dependencies = updated_deps;
4343 }
4344 };
4345
4346 // Update all entry types
4347 update_deps(&mut lockfile.agents);
4348 update_deps(&mut lockfile.snippets);
4349 update_deps(&mut lockfile.commands);
4350 update_deps(&mut lockfile.scripts);
4351 update_deps(&mut lockfile.hooks);
4352 update_deps(&mut lockfile.mcp_servers);
4353
4354 Ok(())
4355 }
4356
4357 /// Verifies that all dependencies can be resolved without performing resolution.
4358 ///
4359 /// This method performs a "dry run" validation of the manifest to detect
4360 /// issues before attempting actual resolution. It's faster than full resolution
4361 /// since it doesn't clone repositories or resolve specific versions.
4362 ///
4363 /// # Validation Steps
4364 ///
4365 /// 1. **Local Path Validation**: Verify local dependencies exist (for absolute paths)
4366 /// 2. **Source Validation**: Ensure all referenced sources are defined
4367 /// 3. **Constraint Validation**: Basic syntax checking of version constraints
4368 ///
4369 /// # Validation Scope
4370 ///
4371 /// - **Manifest Structure**: Validate TOML structure and required fields
4372 /// - **Source References**: Ensure all sources used by dependencies exist
4373 /// - **Local Dependencies**: Check absolute paths exist on filesystem
4374 ///
4375 /// # Performance
4376 ///
4377 /// Verification is designed to be fast:
4378 /// - No network operations (doesn't validate remote repositories)
4379 /// - No Git operations (doesn't check if versions exist)
4380 /// - Only filesystem access for absolute local paths
4381 ///
4385 ///
4386 /// # Returns
4387 ///
4388 /// `Ok(())` if all dependencies pass basic validation.
4389 ///
4390 /// # Errors
4391 ///
4392 /// Verification fails if:
4393 /// - Local dependencies reference non-existent absolute paths
4394 /// - Dependencies reference undefined sources
4395 /// - Manifest structure is invalid or corrupted
4396 ///
4397 /// # Note
4398 ///
4399 /// Successful verification doesn't guarantee resolution will succeed,
4400 /// since network issues or missing versions can still cause failures.
4401 /// Use this method for fast validation before expensive resolution operations.
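    ///
    /// # Examples
    ///
    /// A hypothetical call site that fails fast before any network work:
    ///
    /// ```ignore
    /// resolver.verify()?;
    /// let lockfile = resolver.resolve().await?;
    /// ```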
4402 pub fn verify(&mut self) -> Result<()> {
4403         // Validate each dependency without performing resolution
4404 let deps: Vec<(&str, std::borrow::Cow<'_, ResourceDependency>, crate::core::ResourceType)> =
4405 self.manifest.all_dependencies_with_types();
4406
4407 for (name, dep, _resource_type) in deps {
4408 if dep.is_local() {
4409 // Check if local path exists or is relative
4410 let path = Path::new(dep.get_path());
4411 if path.is_absolute() && !path.exists() {
4412 anyhow::bail!("Local dependency '{}' not found at: {}", name, path.display());
4413 }
4414 } else {
4415 // Verify source exists
4416 let source_name = dep.get_source().ok_or_else(|| AgpmError::ConfigError {
4417 message: format!("Dependency '{name}' has no source specified"),
4418 })?;
4419
4420 if !self.manifest.sources.contains_key(source_name) {
4421 anyhow::bail!(
4422 "Dependency '{name}' references undefined source: '{source_name}'"
4423 );
4424 }
4425 }
4426 }
4427
4428 // Progress bar completion is handled by the caller
4429
4430 Ok(())
4431 }
4432}
4433
4434#[cfg(test)]
4435mod tests {
4436 use super::*;
4437 use crate::manifest::DetailedDependency;
4438 use tempfile::TempDir;
4439
4440 #[test]
4441 fn test_resolver_new() {
4442 let manifest = Manifest::new();
4443 let temp_dir = TempDir::new().unwrap();
4444 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
4445 let resolver = DependencyResolver::with_cache(manifest, cache);
4446
4447 assert_eq!(resolver.cache.get_cache_location(), temp_dir.path());
4448 }
4449
4450 #[tokio::test]
4451 async fn test_resolve_local_dependency() {
4452 let temp_dir = TempDir::new().unwrap();
4453 let mut manifest = Manifest::new();
4454 manifest.manifest_dir = Some(temp_dir.path().to_path_buf());
4455 manifest.add_dependency(
4456 "local-agent".to_string(),
4457 ResourceDependency::Simple("../agents/local.md".to_string()),
4458 true,
4459 );
4460
4461 // Create dummy file to allow transitive dependency extraction
4462 let agents_dir = temp_dir.path().parent().unwrap().join("agents");
4463 std::fs::create_dir_all(&agents_dir).unwrap();
4464 std::fs::write(agents_dir.join("local.md"), "# Local Agent").unwrap();
4465
4466 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
4467 let mut resolver = DependencyResolver::with_cache(manifest, cache);
4468
4469 let lockfile = resolver.resolve().await.unwrap();
4470 assert_eq!(lockfile.agents.len(), 1);
4471
4472 let entry = &lockfile.agents[0];
4473 assert_eq!(entry.name, "local-agent");
4474 assert_eq!(entry.path, "../agents/local.md");
4475 assert!(entry.source.is_none());
4476 assert!(entry.url.is_none());
4477 }
4478
4479 #[tokio::test]
4480 async fn test_pre_sync_sources() {
4481 // Skip test if git is not available
4482 if std::process::Command::new("git").arg("--version").output().is_err() {
4483 eprintln!("Skipping test: git not available");
4484 return;
4485 }
4486
4487 // Create a test Git repository with resources
4488 let temp_dir = TempDir::new().unwrap();
4489 let repo_dir = temp_dir.path().join("test-repo");
4490 std::fs::create_dir(&repo_dir).unwrap();
4491
4492 // Initialize git repo
4493 std::process::Command::new("git").args(["init"]).current_dir(&repo_dir).output().unwrap();
4494
4495 std::process::Command::new("git")
4496 .args(["config", "user.email", "test@example.com"])
4497 .current_dir(&repo_dir)
4498 .output()
4499 .unwrap();
4500
4501 std::process::Command::new("git")
4502 .args(["config", "user.name", "Test User"])
4503 .current_dir(&repo_dir)
4504 .output()
4505 .unwrap();
4506
4507 // Create test files
4508 std::fs::create_dir_all(repo_dir.join("agents")).unwrap();
4509 std::fs::write(repo_dir.join("agents/test.md"), "# Test Agent\n\nTest content").unwrap();
4510
4511 // Commit files
4512 std::process::Command::new("git")
4513 .args(["add", "."])
4514 .current_dir(&repo_dir)
4515 .output()
4516 .unwrap();
4517
4518 std::process::Command::new("git")
4519 .args(["commit", "-m", "Initial commit"])
4520 .current_dir(&repo_dir)
4521 .output()
4522 .unwrap();
4523
4524 std::process::Command::new("git")
4525 .args(["tag", "v1.0.0"])
4526 .current_dir(&repo_dir)
4527 .output()
4528 .unwrap();
4529
4530 // Create a manifest with a dependency from this source
4531 let mut manifest = Manifest::new();
4532 let source_url = format!("file://{}", repo_dir.display());
4533 manifest.add_source("test-source".to_string(), source_url.clone());
4534
4535 manifest.add_dependency(
4536 "test-agent".to_string(),
4537 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
4538 source: Some("test-source".to_string()),
4539 path: "agents/test.md".to_string(),
4540 version: Some("v1.0.0".to_string()),
4541 branch: None,
4542 rev: None,
4543 command: None,
4544 args: None,
4545 target: None,
4546 filename: None,
4547 dependencies: None,
4548 tool: Some("claude-code".to_string()),
4549 flatten: None,
4550 })),
4551 true,
4552 );
4553
4554 // Create resolver with test cache
4555 let cache_dir = TempDir::new().unwrap();
4556 let cache = Cache::with_dir(cache_dir.path().to_path_buf()).unwrap();
4557 let mut resolver = DependencyResolver::with_cache(manifest.clone(), cache);
4558
4559 // Prepare dependencies for pre-sync
4560 let deps: Vec<(String, ResourceDependency)> = manifest
4561 .all_dependencies()
4562 .into_iter()
4563 .map(|(name, dep)| (name.to_string(), dep.clone()))
4564 .collect();
4565
4566 // Call pre_sync_sources - this should clone the repository and prepare entries
4567 resolver.pre_sync_sources(&deps).await.unwrap();
4568
4569 // Verify that entries and repos are prepared
4570 assert!(
4571 resolver.version_resolver.pending_count() > 0,
4572 "Should have entries after pre-sync"
4573 );
4574
4575 let bare_repo = resolver.version_resolver.get_bare_repo_path("test-source");
4576 assert!(bare_repo.is_some(), "Should have bare repo path cached");
4577
4578 // Verify the repository exists in cache (uses normalized name)
4579 let cached_repo_path = resolver.cache.get_cache_location().join("sources");
4580
4581 // The cache normalizes the source name, so we check if any .git directory exists
4582 let mut found_repo = false;
4583 if let Ok(entries) = std::fs::read_dir(&cached_repo_path) {
4584 for entry in entries.flatten() {
4585 if let Some(name) = entry.file_name().to_str()
4586 && name.ends_with(".git")
4587 {
4588 found_repo = true;
4589 break;
4590 }
4591 }
4592 }
4593 assert!(found_repo, "Repository should be cloned to cache");
4594
4595 // Now call resolve_all() - it should work without cloning again
4596 resolver.version_resolver.resolve_all().await.unwrap();
4597
4598 // Verify resolution succeeded by checking we have resolved versions
4599 let all_resolved = resolver.version_resolver.get_all_resolved();
4600 assert!(!all_resolved.is_empty(), "Resolution should produce resolved versions");
4601
4602 // Check that v1.0.0 was resolved to a SHA
4603 let key = ("test-source".to_string(), "v1.0.0".to_string());
4604 assert!(all_resolved.contains_key(&key), "Should have resolved v1.0.0");
4605
4606 let sha = all_resolved.get(&key).unwrap();
4607 assert_eq!(sha.len(), 40, "SHA should be 40 characters");
4608 }
4609
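    // The manifests in these tests build `DetailedDependency` literals whose fields
    // are mostly `None`. A minimal sketch of a constructor helper, assembled only
    // from the fields shown in these tests; `detailed_dep` is a hypothetical name
    // and not part of the crate API. For example,
    // `detailed_dep(Some("test-source"), "agents/test.md", Some("v1.0.0"))`
    // would mirror the literal built in `test_pre_sync_sources` above.
    #[allow(dead_code)]
    fn detailed_dep(
        source: Option<&str>,
        path: &str,
        version: Option<&str>,
    ) -> ResourceDependency {
        ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
            source: source.map(str::to_string),
            path: path.to_string(),
            version: version.map(str::to_string),
            branch: None,
            rev: None,
            command: None,
            args: None,
            target: None,
            filename: None,
            dependencies: None,
            tool: Some("claude-code".to_string()),
            flatten: None,
        }))
    }
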
4610 #[test]
4611 fn test_verify_missing_source() {
4612 let mut manifest = Manifest::new();
4613
4614 // Add dependency without corresponding source
4615 manifest.add_dependency(
4616 "remote-agent".to_string(),
4617 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
4618 source: Some("nonexistent".to_string()),
4619 path: "agents/test.md".to_string(),
4620 version: None,
4621 branch: None,
4622 rev: None,
4623 command: None,
4624 args: None,
4625 target: None,
4626 filename: None,
4627 dependencies: None,
4628 tool: Some("claude-code".to_string()),
4629 flatten: None,
4630 })),
4631 true,
4632 );
4633
4634 let temp_dir = TempDir::new().unwrap();
4635 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
4636 let mut resolver = DependencyResolver::with_cache(manifest, cache);
4637
4638 let result = resolver.verify();
4639 assert!(result.is_err());
4640 assert!(result.unwrap_err().to_string().contains("undefined source"));
4641 }
4642
4643 #[tokio::test]
4644 async fn test_resolve_with_source_dependency() {
4645 let temp_dir = TempDir::new().unwrap();
4646
4647 // Create a local mock git repository
4648 let source_dir = temp_dir.path().join("test-source");
4649 std::fs::create_dir_all(&source_dir).unwrap();
4650 std::process::Command::new("git")
4651 .args(["init"])
4652 .current_dir(&source_dir)
4653 .output()
4654 .expect("Failed to initialize git repository");
4655
4656 // Create the agents directory and test file
4657 let agents_dir = source_dir.join("agents");
4658 std::fs::create_dir_all(&agents_dir).unwrap();
4659 std::fs::write(agents_dir.join("test.md"), "# Test Agent").unwrap();
4660
4661 // Add and commit the file
4662 std::process::Command::new("git")
4663 .args(["add", "."])
4664 .current_dir(&source_dir)
4665 .output()
4666 .unwrap();
4667 std::process::Command::new("git")
4668 .args(["config", "user.email", "test@example.com"])
4669 .current_dir(&source_dir)
4670 .output()
4671 .unwrap();
4672 std::process::Command::new("git")
4673 .args(["config", "user.name", "Test User"])
4674 .current_dir(&source_dir)
4675 .output()
4676 .unwrap();
4677 std::process::Command::new("git")
4678 .args(["commit", "-m", "Initial commit"])
4679 .current_dir(&source_dir)
4680 .output()
4681 .unwrap();
4682
4683 // Create a tag for version
4684 std::process::Command::new("git")
4685 .args(["tag", "v1.0.0"])
4686 .current_dir(&source_dir)
4687 .output()
4688 .unwrap();
4689
4690 let mut manifest = Manifest::new();
4691 // Use file:// URL to ensure it's treated as a Git source, not a local path
4692 let source_url = format!("file://{}", source_dir.display());
4693 manifest.add_source("test".to_string(), source_url);
4694 manifest.add_dependency(
4695 "remote-agent".to_string(),
4696 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
4697 source: Some("test".to_string()),
4698 path: "agents/test.md".to_string(),
4699 version: Some("v1.0.0".to_string()),
4700 branch: None,
4701 rev: None,
4702 command: None,
4703 args: None,
4704 target: None,
4705 filename: None,
4706 dependencies: None,
4707 tool: Some("claude-code".to_string()),
4708 flatten: None,
4709 })),
4710 true,
4711 );
4712
4713 let cache_dir = temp_dir.path().join("cache");
4714 let cache = Cache::with_dir(cache_dir).unwrap();
4715 let mut resolver = DependencyResolver::with_cache(manifest, cache);
4716
4717 // This should now succeed with the local repository
4718 let result = resolver.resolve().await;
4719 assert!(result.is_ok());
4720 }
4721
4722 #[tokio::test]
4723 async fn test_resolve_with_progress() {
4724 let temp_dir = TempDir::new().unwrap();
4725 let mut manifest = Manifest::new();
4726 manifest.manifest_dir = Some(temp_dir.path().to_path_buf());
4727 manifest.add_dependency(
4728 "local".to_string(),
4729 ResourceDependency::Simple("test.md".to_string()),
4730 true,
4731 );
4732
4733 // Create dummy file to allow transitive dependency extraction
4734 std::fs::write(temp_dir.path().join("test.md"), "# Test").unwrap();
4735
4736 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
4737 let mut resolver = DependencyResolver::with_cache(manifest, cache);
4738
4739 let lockfile = resolver.resolve().await.unwrap();
4740 assert_eq!(lockfile.agents.len(), 1);
4741 }
4742
4743 #[test]
4744 fn test_verify_with_progress() {
4745 let mut manifest = Manifest::new();
4746 manifest.add_source("test".to_string(), "https://github.com/test/repo.git".to_string());
4747 manifest.add_dependency(
4748 "agent".to_string(),
4749 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
4750 source: Some("test".to_string()),
4751 path: "agents/test.md".to_string(),
4752 version: None,
4753 branch: None,
4754 rev: None,
4755 command: None,
4756 args: None,
4757 target: None,
4758 filename: None,
4759 dependencies: None,
4760 tool: Some("claude-code".to_string()),
4761 flatten: None,
4762 })),
4763 true,
4764 );
4765
4766 let temp_dir = TempDir::new().unwrap();
4767 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
4768 let mut resolver = DependencyResolver::with_cache(manifest, cache);
4769
4770 let result = resolver.verify();
4771 assert!(result.is_ok());
4772 }
4773
4774 #[tokio::test]
4775 async fn test_resolve_with_git_ref() {
4776 let temp_dir = TempDir::new().unwrap();
4777
4778 // Create a local mock git repository
4779 let source_dir = temp_dir.path().join("test-source");
4780 std::fs::create_dir_all(&source_dir).unwrap();
4781 std::process::Command::new("git")
4782 .args(["init"])
4783 .current_dir(&source_dir)
4784 .output()
4785 .expect("Failed to initialize git repository");
4786
4787 // Configure git
4788 std::process::Command::new("git")
4789 .args(["config", "user.email", "test@example.com"])
4790 .current_dir(&source_dir)
4791 .output()
4792 .unwrap();
4793 std::process::Command::new("git")
4794 .args(["config", "user.name", "Test User"])
4795 .current_dir(&source_dir)
4796 .output()
4797 .unwrap();
4798
4799 // Create the agents directory and test file
4800 let agents_dir = source_dir.join("agents");
4801 std::fs::create_dir_all(&agents_dir).unwrap();
4802 std::fs::write(agents_dir.join("test.md"), "# Test Agent").unwrap();
4803
4804 // Add and commit the file
4805 std::process::Command::new("git")
4806 .args(["add", "."])
4807 .current_dir(&source_dir)
4808 .output()
4809 .unwrap();
4810 std::process::Command::new("git")
4811 .args(["commit", "-m", "Initial commit"])
4812 .current_dir(&source_dir)
4813 .output()
4814 .unwrap();
4815
4816 // Ensure the default branch is named main (older git versions may initialize it as master)
4817 std::process::Command::new("git")
4818 .args(["branch", "-M", "main"])
4819 .current_dir(&source_dir)
4820 .output()
4821 .unwrap();
4822
4823 let mut manifest = Manifest::new();
4824 // Use file:// URL to ensure it's treated as a Git source, not a local path
4825 let source_url = format!("file://{}", source_dir.display());
4826 manifest.add_source("test".to_string(), source_url);
4827 manifest.add_dependency(
4828 "git-agent".to_string(),
4829 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
4830 source: Some("test".to_string()),
4831 path: "agents/test.md".to_string(),
4832 version: None,
4833 branch: None,
4834 rev: None,
4835 command: None,
4836 args: None,
4837 target: None,
4838 filename: None,
4839 dependencies: None,
4840 tool: Some("claude-code".to_string()),
4841 flatten: None,
4842 })),
4843 true,
4844 );
4845
4846 let cache_dir = temp_dir.path().join("cache");
4847 let cache = Cache::with_dir(cache_dir).unwrap();
4848 let mut resolver = DependencyResolver::with_cache(manifest, cache);
4849
4850 // This should now succeed with the local repository
4851 let result = resolver.resolve().await;
4852 if let Err(e) = &result {
4853 eprintln!("Test failed with error: {:#}", e);
4854 }
4855 assert!(result.is_ok());
4856 }
4857
4858 #[tokio::test]
4859 async fn test_new_with_global() {
4860 let manifest = Manifest::new();
4861 let cache = Cache::new().unwrap();
4862 let result = DependencyResolver::new_with_global(manifest, cache).await;
4863 assert!(result.is_ok());
4864 }
4865
4866 #[test]
4867 fn test_resolver_new_default() {
4868 let manifest = Manifest::new();
4869 let cache = Cache::new().unwrap();
4870 let result = DependencyResolver::new(manifest, cache);
4871 assert!(result.is_ok());
4872 }
4873
4874 #[tokio::test]
4875 async fn test_resolve_multiple_dependencies() {
4876 let temp_dir = TempDir::new().unwrap();
4877 let mut manifest = Manifest::new();
4878 manifest.manifest_dir = Some(temp_dir.path().to_path_buf());
4879 manifest.add_dependency(
4880 "agent1".to_string(),
4881 ResourceDependency::Simple("a1.md".to_string()),
4882 true,
4883 );
4884 manifest.add_dependency(
4885 "agent2".to_string(),
4886 ResourceDependency::Simple("a2.md".to_string()),
4887 true,
4888 );
4889 manifest.add_dependency(
4890 "snippet1".to_string(),
4891 ResourceDependency::Simple("s1.md".to_string()),
4892 false,
4893 );
4894
4895 // Create dummy files to allow transitive dependency extraction
4896 std::fs::write(temp_dir.path().join("a1.md"), "# Agent 1").unwrap();
4897 std::fs::write(temp_dir.path().join("a2.md"), "# Agent 2").unwrap();
4898 std::fs::write(temp_dir.path().join("s1.md"), "# Snippet 1").unwrap();
4899
4900 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
4901 let mut resolver = DependencyResolver::with_cache(manifest, cache);
4902
4903 let lockfile = resolver.resolve().await.unwrap();
4904 assert_eq!(lockfile.agents.len(), 2);
4905 assert_eq!(lockfile.snippets.len(), 1);
4906 }
4907
4908 #[test]
4909 fn test_verify_local_dependency() {
4910 let mut manifest = Manifest::new();
4911 manifest.add_dependency(
4912 "local-agent".to_string(),
4913 ResourceDependency::Simple("../local/agent.md".to_string()),
4914 true,
4915 );
4916
4917 let temp_dir = TempDir::new().unwrap();
4918 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
4919 let mut resolver = DependencyResolver::with_cache(manifest, cache);
4920
4921 let result = resolver.verify();
4922 assert!(result.is_ok());
4923 }
4924
4925 #[tokio::test]
4926 async fn test_resolve_with_empty_manifest() {
4927 let manifest = Manifest::new();
4928 let temp_dir = TempDir::new().unwrap();
4929 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
4930 let mut resolver = DependencyResolver::with_cache(manifest, cache);
4931
4932 let lockfile = resolver.resolve().await.unwrap();
4933 assert_eq!(lockfile.agents.len(), 0);
4934 assert_eq!(lockfile.snippets.len(), 0);
4935 assert_eq!(lockfile.sources.len(), 0);
4936 }
4937
4938 #[tokio::test]
4939 async fn test_resolve_with_custom_target() {
4940 let temp_dir = TempDir::new().unwrap();
4941 let mut manifest = Manifest::new();
4942 manifest.manifest_dir = Some(temp_dir.path().to_path_buf());
4943
4944 // Add local dependency with custom target
4945 manifest.add_dependency(
4946 "custom-agent".to_string(),
4947 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
4948 source: None,
4949 path: "../test.md".to_string(),
4950 version: None,
4951 branch: None,
4952 rev: None,
4953 command: None,
4954 args: None,
4955 target: Some("integrations/custom".to_string()),
4956 filename: None,
4957 dependencies: None,
4958 tool: Some("claude-code".to_string()),
4959 flatten: None,
4960 })),
4961 true,
4962 );
4963
4964 // Create dummy file to allow transitive dependency extraction
4965 std::fs::write(temp_dir.path().parent().unwrap().join("test.md"), "# Test").unwrap();
4966
4967 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
4968 let mut resolver = DependencyResolver::with_cache(manifest, cache);
4969
4970 let lockfile = resolver.resolve().await.unwrap();
4971 assert_eq!(lockfile.agents.len(), 1);
4972
4973 let agent = &lockfile.agents[0];
4974 assert_eq!(agent.name, "custom-agent");
4975 // Verify the custom target is relative to the default agents directory
4976 // Normalize path separators for cross-platform testing
4977 let normalized_path = normalize_path_for_storage(&agent.installed_at);
4978 assert!(normalized_path.contains(".claude/agents/integrations/custom"));
4979 // Path ../test.md extracts to test.md after stripping parent components
4980 assert_eq!(normalized_path, ".claude/agents/integrations/custom/test.md");
4981 }
4982
4983 #[tokio::test]
4984 async fn test_resolve_without_custom_target() {
4985 let temp_dir = TempDir::new().unwrap();
4986 let mut manifest = Manifest::new();
4987 manifest.manifest_dir = Some(temp_dir.path().to_path_buf());
4988
4989 // Add local dependency without custom target
4990 manifest.add_dependency(
4991 "standard-agent".to_string(),
4992 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
4993 source: None,
4994 path: "../test.md".to_string(),
4995 version: None,
4996 branch: None,
4997 rev: None,
4998 command: None,
4999 args: None,
5000 target: None,
5001 filename: None,
5002 dependencies: None,
5003 tool: Some("claude-code".to_string()),
5004 flatten: None,
5005 })),
5006 true,
5007 );
5008
5009 // Create dummy file to allow transitive dependency extraction
5010 std::fs::write(temp_dir.path().parent().unwrap().join("test.md"), "# Test").unwrap();
5011
5012 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
5013 let mut resolver = DependencyResolver::with_cache(manifest, cache);
5014
5015 let lockfile = resolver.resolve().await.unwrap();
5016 assert_eq!(lockfile.agents.len(), 1);
5017
5018 let agent = &lockfile.agents[0];
5019 assert_eq!(agent.name, "standard-agent");
5020 // Verify the default target is used
5021 // Normalize path separators for cross-platform testing
5022 let normalized_path = normalize_path_for_storage(&agent.installed_at);
5023 // Path ../test.md extracts to test.md after stripping parent components
5024 assert_eq!(normalized_path, ".claude/agents/test.md");
5025 }
5026
5027 #[tokio::test]
5028 async fn test_resolve_with_custom_filename() {
5029 let temp_dir = TempDir::new().unwrap();
5030 let mut manifest = Manifest::new();
5031 manifest.manifest_dir = Some(temp_dir.path().to_path_buf());
5032
5033 // Add local dependency with custom filename
5034 manifest.add_dependency(
5035 "my-agent".to_string(),
5036 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
5037 source: None,
5038 path: "../test.md".to_string(),
5039 version: None,
5040 branch: None,
5041 rev: None,
5042 command: None,
5043 args: None,
5044 target: None,
5045 filename: Some("ai-assistant.txt".to_string()),
5046 dependencies: None,
5047 tool: Some("claude-code".to_string()),
5048 flatten: None,
5049 })),
5050 true,
5051 );
5052
5053 // Create dummy file to allow transitive dependency extraction
5054 std::fs::write(temp_dir.path().parent().unwrap().join("test.md"), "# Test").unwrap();
5055
5056 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
5057 let mut resolver = DependencyResolver::with_cache(manifest, cache);
5058
5059 let lockfile = resolver.resolve().await.unwrap();
5060 assert_eq!(lockfile.agents.len(), 1);
5061
5062 let agent = &lockfile.agents[0];
5063 assert_eq!(agent.name, "my-agent");
5064 // Verify the custom filename is used
5065 // Normalize path separators for cross-platform testing
5066 let normalized_path = normalize_path_for_storage(&agent.installed_at);
5067 assert_eq!(normalized_path, ".claude/agents/ai-assistant.txt");
5068 }
5069
5070 #[tokio::test]
5071 async fn test_resolve_with_custom_filename_and_target() {
5072 let temp_dir = TempDir::new().unwrap();
5073 let mut manifest = Manifest::new();
5074 manifest.manifest_dir = Some(temp_dir.path().to_path_buf());
5075
5076 // Add local dependency with both custom filename and target
5077 manifest.add_dependency(
5078 "special-tool".to_string(),
5079 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
5080 source: None,
5081 path: "../test.md".to_string(),
5082 version: None,
5083 branch: None,
5084 rev: None,
5085 command: None,
5086 args: None,
5087 target: Some("tools/ai".to_string()),
5088 filename: Some("assistant.markdown".to_string()),
5089 dependencies: None,
5090 tool: Some("claude-code".to_string()),
5091 flatten: None,
5092 })),
5093 true,
5094 );
5095
5096 // Create dummy file to allow transitive dependency extraction
5097 std::fs::write(temp_dir.path().parent().unwrap().join("test.md"), "# Test").unwrap();
5098
5099 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
5100 let mut resolver = DependencyResolver::with_cache(manifest, cache);
5101
5102 let lockfile = resolver.resolve().await.unwrap();
5103 assert_eq!(lockfile.agents.len(), 1);
5104
5105 let agent = &lockfile.agents[0];
5106 assert_eq!(agent.name, "special-tool");
5107 // Verify both custom target and filename are used
5108 // Custom target is relative to default agents directory
5109 // Normalize path separators for cross-platform testing
5110 let normalized_path = normalize_path_for_storage(&agent.installed_at);
5111 assert_eq!(normalized_path, ".claude/agents/tools/ai/assistant.markdown");
5112 }
5113
5114 #[tokio::test]
5115 async fn test_resolve_script_with_custom_filename() {
5116 let temp_dir = TempDir::new().unwrap();
5117 let mut manifest = Manifest::new();
5118 manifest.manifest_dir = Some(temp_dir.path().to_path_buf());
5119
5120 // Add script with custom filename (different extension)
5121 manifest.add_dependency(
5122 "analyzer".to_string(),
5123 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
5124 source: None,
5125 path: "../scripts/data-analyzer-v3.py".to_string(),
5126 version: None,
5127 branch: None,
5128 rev: None,
5129 command: None,
5130 args: None,
5131 target: None,
5132 filename: Some("analyze.py".to_string()),
5133 dependencies: None,
5134 tool: Some("claude-code".to_string()),
5135 flatten: None,
5136 })),
5137 false, // not an agent; registered in the snippets section
5138 );
5139
5140 // Create dummy script file to allow transitive dependency extraction
5141 let scripts_dir = temp_dir.path().parent().unwrap().join("scripts");
5142 std::fs::create_dir_all(&scripts_dir).unwrap();
5143 std::fs::write(scripts_dir.join("data-analyzer-v3.py"), "#!/usr/bin/env python3").unwrap();
5144
5145 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
5146 let mut resolver = DependencyResolver::with_cache(manifest, cache);
5147
5148 let lockfile = resolver.resolve().await.unwrap();
5149 // The `false` flag registers this dependency as a snippet, so it appears in the snippets array
5150 assert_eq!(lockfile.snippets.len(), 1);
5151
5152 let script = &lockfile.snippets[0];
5153 assert_eq!(script.name, "analyzer");
5154 // Verify custom filename is used (with custom extension)
5155 // Normalize path separators for cross-platform testing
5156 // Uses claude-code tool, so snippets go to .claude/snippets/
5157 let normalized_path = normalize_path_for_storage(&script.installed_at);
5158 assert_eq!(normalized_path, ".claude/snippets/analyze.py");
5159 }
5160
5161 // ============ NEW TESTS FOR UNCOVERED AREAS ============
5162
5163 // The pattern tests below avoid changing the process working directory (which would
5164 // break parallel test safety) by using absolute glob paths rooted in the temp project
5165 // directory, or by resolving patterns against a remote Git source.
5166
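    // A small illustration of the glob semantics the pattern tests below rely on:
    // `agents/python-*.md` should match the two python agents but not the rust one.
    // The `glob` crate is used here purely as a sketch of that behavior and may not
    // be what the resolver uses internally; this assumes the crate is available as
    // a (dev-)dependency.
    #[test]
    fn test_pattern_matching_semantics_sketch() {
        let pattern = glob::Pattern::new("agents/python-*.md").unwrap();
        assert!(pattern.matches("agents/python-linter.md"));
        assert!(pattern.matches("agents/python-formatter.md"));
        assert!(!pattern.matches("agents/rust-linter.md"));
    }
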
5167 #[tokio::test]
5168 async fn test_resolve_pattern_dependency_local() {
5169 let temp_dir = TempDir::new().unwrap();
5170 let project_dir = temp_dir.path();
5171
5172 // Create local agent files
5173 let agents_dir = project_dir.join("agents");
5174 std::fs::create_dir_all(&agents_dir).unwrap();
5175 std::fs::write(agents_dir.join("helper.md"), "# Helper Agent").unwrap();
5176 std::fs::write(agents_dir.join("assistant.md"), "# Assistant Agent").unwrap();
5177 std::fs::write(agents_dir.join("tester.md"), "# Tester Agent").unwrap();
5178
5179 // Create manifest with local pattern dependency
5180 let mut manifest = Manifest::new();
5181 manifest.manifest_dir = Some(project_dir.to_path_buf());
5182 manifest.add_dependency(
5183 "local-agents".to_string(),
5184 ResourceDependency::Simple(format!("{}/agents/*.md", project_dir.display())),
5185 true,
5186 );
5187
5188 // Create resolver and resolve dependencies
5189 let cache_dir = temp_dir.path().join("cache");
5190 let cache = Cache::with_dir(cache_dir).unwrap();
5191 let mut resolver = DependencyResolver::with_cache(manifest, cache);
5192
5193 let lockfile = resolver.resolve().await.unwrap();
5194
5195 // Verify all agents were resolved
5196 assert_eq!(lockfile.agents.len(), 3);
5197 let agent_names: Vec<String> = lockfile.agents.iter().map(|a| a.name.clone()).collect();
5198 assert!(agent_names.contains(&"helper".to_string()));
5199 assert!(agent_names.contains(&"assistant".to_string()));
5200 assert!(agent_names.contains(&"tester".to_string()));
5201 }
5202
5203 #[tokio::test]
5204 async fn test_resolve_pattern_dependency_remote() {
5205 let temp_dir = TempDir::new().unwrap();
5206
5207 // Create a local mock git repository with pattern-matching files
5208 let source_dir = temp_dir.path().join("test-source");
5209 std::fs::create_dir_all(&source_dir).unwrap();
5210 std::process::Command::new("git")
5211 .args(["init"])
5212 .current_dir(&source_dir)
5213 .output()
5214 .expect("Failed to initialize git repository");
5215
5216 // Configure git
5217 std::process::Command::new("git")
5218 .args(["config", "user.email", "test@example.com"])
5219 .current_dir(&source_dir)
5220 .output()
5221 .unwrap();
5222 std::process::Command::new("git")
5223 .args(["config", "user.name", "Test User"])
5224 .current_dir(&source_dir)
5225 .output()
5226 .unwrap();
5227
5228 // Create multiple agent files
5229 let agents_dir = source_dir.join("agents");
5230 std::fs::create_dir_all(&agents_dir).unwrap();
5231 std::fs::write(agents_dir.join("python-linter.md"), "# Python Linter").unwrap();
5232 std::fs::write(agents_dir.join("python-formatter.md"), "# Python Formatter").unwrap();
5233 std::fs::write(agents_dir.join("rust-linter.md"), "# Rust Linter").unwrap();
5234
5235 // Add and commit
5236 std::process::Command::new("git")
5237 .args(["add", "."])
5238 .current_dir(&source_dir)
5239 .output()
5240 .unwrap();
5241 std::process::Command::new("git")
5242 .args(["commit", "-m", "Add agents"])
5243 .current_dir(&source_dir)
5244 .output()
5245 .unwrap();
5246
5247 // Create a tag
5248 std::process::Command::new("git")
5249 .args(["tag", "v1.0.0"])
5250 .current_dir(&source_dir)
5251 .output()
5252 .unwrap();
5253
5254 let mut manifest = Manifest::new();
5255 // Use file:// URL to ensure it's treated as a Git source, not a local path
5256 let source_url = format!("file://{}", source_dir.display());
5257 manifest.add_source("test".to_string(), source_url);
5258
5259 // Add pattern dependency for python agents
5260 manifest.add_dependency(
5261 "python-tools".to_string(),
5262 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
5263 source: Some("test".to_string()),
5264 path: "agents/python-*.md".to_string(),
5265 version: Some("v1.0.0".to_string()),
5266 branch: None,
5267 rev: None,
5268 command: None,
5269 args: None,
5270 target: None,
5271 filename: None,
5272 dependencies: None,
5273 tool: Some("claude-code".to_string()),
5274 flatten: None,
5275 })),
5276 true, // agents
5277 );
5278
5279 let cache_dir = temp_dir.path().join("cache");
5280 let cache = Cache::with_dir(cache_dir).unwrap();
5281 let mut resolver = DependencyResolver::with_cache(manifest, cache);
5282
5283 let lockfile = resolver.resolve().await.unwrap();
5284 // Should have resolved to 2 python agents
5285 assert_eq!(lockfile.agents.len(), 2);
5286
5287 // Check that both python agents were found
5288 let agent_names: Vec<String> = lockfile.agents.iter().map(|a| a.name.clone()).collect();
5289 assert!(agent_names.contains(&"python-linter".to_string()));
5290 assert!(agent_names.contains(&"python-formatter".to_string()));
5291 assert!(!agent_names.contains(&"rust-linter".to_string()));
5292 }
5293
5294 #[tokio::test]
5295 async fn test_resolve_pattern_dependency_with_custom_target() {
5296 let temp_dir = TempDir::new().unwrap();
5297 let project_dir = temp_dir.path();
5298
5299 // Create local agent files
5300 let agents_dir = project_dir.join("agents");
5301 std::fs::create_dir_all(&agents_dir).unwrap();
5302 std::fs::write(agents_dir.join("helper.md"), "# Helper Agent").unwrap();
5303 std::fs::write(agents_dir.join("assistant.md"), "# Assistant Agent").unwrap();
5304
5305 // Create manifest with local pattern dependency and custom target
5306 let mut manifest = Manifest::new();
5307 manifest.manifest_dir = Some(project_dir.to_path_buf());
5308 manifest.add_dependency(
5309 "custom-agents".to_string(),
5310 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
5311 source: None,
5312 path: format!("{}/agents/*.md", project_dir.display()),
5313 version: None,
5314 branch: None,
5315 rev: None,
5316 command: None,
5317 args: None,
5318 target: Some("custom/agents".to_string()),
5319 filename: None,
5320 dependencies: None,
5321 tool: Some("claude-code".to_string()),
5322 flatten: None,
5323 })),
5324 true,
5325 );
5326
5327 // Create resolver and resolve dependencies
5328 let cache_dir = temp_dir.path().join("cache");
5329 let cache = Cache::with_dir(cache_dir).unwrap();
5330 let mut resolver = DependencyResolver::with_cache(manifest, cache);
5331
5332 let lockfile = resolver.resolve().await.unwrap();
5333
5334 // Verify agents were resolved with custom target
5335 // Custom target is relative to default agents directory
5336 assert_eq!(lockfile.agents.len(), 2);
5337 for agent in &lockfile.agents {
5338 assert!(agent.installed_at.starts_with(".claude/agents/custom/agents/"));
5339 }
5340
5341 let agent_names: Vec<String> = lockfile.agents.iter().map(|a| a.name.clone()).collect();
5342 assert!(agent_names.contains(&"helper".to_string()));
5343 assert!(agent_names.contains(&"assistant".to_string()));
5344 }
5345
5346 #[tokio::test]
5347 async fn test_update_specific_dependencies() {
5348 let temp_dir = TempDir::new().unwrap();
5349
5350 // Create a local mock git repository
5351 let source_dir = temp_dir.path().join("test-source");
5352 std::fs::create_dir_all(&source_dir).unwrap();
5353 std::process::Command::new("git")
5354 .args(["init"])
5355 .current_dir(&source_dir)
5356 .output()
5357 .expect("Failed to initialize git repository");
5358
5359 // Configure git
5360 std::process::Command::new("git")
5361 .args(["config", "user.email", "test@example.com"])
5362 .current_dir(&source_dir)
5363 .output()
5364 .unwrap();
5365 std::process::Command::new("git")
5366 .args(["config", "user.name", "Test User"])
5367 .current_dir(&source_dir)
5368 .output()
5369 .unwrap();
5370
5371 // Create initial files
5372 let agents_dir = source_dir.join("agents");
5373 std::fs::create_dir_all(&agents_dir).unwrap();
5374 std::fs::write(agents_dir.join("agent1.md"), "# Agent 1 v1").unwrap();
5375 std::fs::write(agents_dir.join("agent2.md"), "# Agent 2 v1").unwrap();
5376
5377 // Initial commit
5378 std::process::Command::new("git")
5379 .args(["add", "."])
5380 .current_dir(&source_dir)
5381 .output()
5382 .unwrap();
5383 std::process::Command::new("git")
5384 .args(["commit", "-m", "Initial"])
5385 .current_dir(&source_dir)
5386 .output()
5387 .unwrap();
5388 std::process::Command::new("git")
5389 .args(["tag", "v1.0.0"])
5390 .current_dir(&source_dir)
5391 .output()
5392 .unwrap();
5393
5394 // Update agent1 and create v2.0.0
5395 std::fs::write(agents_dir.join("agent1.md"), "# Agent 1 v2").unwrap();
5396 std::process::Command::new("git")
5397 .args(["add", "."])
5398 .current_dir(&source_dir)
5399 .output()
5400 .unwrap();
5401 std::process::Command::new("git")
5402 .args(["commit", "-m", "Update agent1"])
5403 .current_dir(&source_dir)
5404 .output()
5405 .unwrap();
5406 std::process::Command::new("git")
5407 .args(["tag", "v2.0.0"])
5408 .current_dir(&source_dir)
5409 .output()
5410 .unwrap();
5411
5412 let mut manifest = Manifest::new();
5413 // Use file:// URL to ensure it's treated as a Git source, not a local path
5414 let source_url = format!("file://{}", source_dir.display());
5415 manifest.add_source("test".to_string(), source_url);
5416
5417 // Add dependencies - initially both at v1.0.0
5418 manifest.add_dependency(
5419 "agent1".to_string(),
5420 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
5421 source: Some("test".to_string()),
5422 path: "agents/agent1.md".to_string(),
5423 version: Some("v1.0.0".to_string()), // Start with v1.0.0
5424 branch: None,
5425 rev: None,
5426 command: None,
5427 args: None,
5428 target: None,
5429 filename: None,
5430 dependencies: None,
5431 tool: Some("claude-code".to_string()),
5432 flatten: None,
5433 })),
5434 true,
5435 );
5436 manifest.add_dependency(
5437 "agent2".to_string(),
5438 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
5439 source: Some("test".to_string()),
5440 path: "agents/agent2.md".to_string(),
5441 version: Some("v1.0.0".to_string()), // Start with v1.0.0
5442 branch: None,
5443 rev: None,
5444 command: None,
5445 args: None,
5446 target: None,
5447 filename: None,
5448 dependencies: None,
5449 tool: Some("claude-code".to_string()),
5450 flatten: None,
5451 })),
5452 true,
5453 );
5454
5455 let cache_dir = temp_dir.path().join("cache");
5456 let cache = Cache::with_dir(cache_dir.clone()).unwrap();
5457 let mut resolver = DependencyResolver::with_cache(manifest.clone(), cache);
5458
5459 // First resolve with v1.0.0 for both
5460 let initial_lockfile = resolver.resolve().await.unwrap();
5461 assert_eq!(initial_lockfile.agents.len(), 2);
5462
5463 // Create a new manifest with agent1 updated to v2.0.0
5464 let mut updated_manifest = Manifest::new();
5465 // Use file:// URL to ensure it's treated as a Git source, not a local path
5466 updated_manifest.add_source("test".to_string(), format!("file://{}", source_dir.display()));
5467 updated_manifest.add_dependency(
5468 "agent1".to_string(),
5469 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
5470 source: Some("test".to_string()),
5471 path: "agents/agent1.md".to_string(),
5472 version: Some("v2.0.0".to_string()),
5473 branch: None,
5474 rev: None,
5475 command: None,
5476 args: None,
5477 target: None,
5478 filename: None,
5479 dependencies: None,
5480 tool: Some("claude-code".to_string()),
5481 flatten: None,
5482 })),
5483 true,
5484 );
5485 updated_manifest.add_dependency(
5486 "agent2".to_string(),
5487 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
5488 source: Some("test".to_string()),
5489 path: "agents/agent2.md".to_string(),
5490 version: Some("v1.0.0".to_string()), // Keep v1.0.0
5491 branch: None,
5492 rev: None,
5493 command: None,
5494 args: None,
5495 target: None,
5496 filename: None,
5497 dependencies: None,
5498 tool: Some("claude-code".to_string()),
5499 flatten: None,
5500 })),
5501 true,
5502 );
5503
5504 // Now update only agent1
5505 let cache2 = Cache::with_dir(cache_dir).unwrap();
5506 let mut resolver2 = DependencyResolver::with_cache(updated_manifest, cache2);
5507 let updated_lockfile =
5508 resolver2.update(&initial_lockfile, Some(vec!["agent1".to_string()])).await.unwrap();
5509
5510 // agent1 should be updated to v2.0.0
5511 let agent1 = updated_lockfile
5512 .agents
5513 .iter()
5514 .find(|a| a.name == "agent1" && a.version.as_deref() == Some("v2.0.0"))
5515 .unwrap();
5516 assert_eq!(agent1.version.as_ref().unwrap(), "v2.0.0");
5517
5518 // agent2 should remain at v1.0.0
5519 let agent2 = updated_lockfile
5520 .agents
5521 .iter()
5522 .find(|a| a.name == "agent2" && a.version.as_deref() == Some("v1.0.0"))
5523 .unwrap();
5524 assert_eq!(agent2.version.as_ref().unwrap(), "v1.0.0");
5525 }
5526
5527 #[tokio::test]
5528 async fn test_update_all_dependencies() {
5529 let temp_dir = TempDir::new().unwrap();
5530 let mut manifest = Manifest::new();
5531 manifest.manifest_dir = Some(temp_dir.path().to_path_buf());
5532 manifest.add_dependency(
5533 "local1".to_string(),
5534 ResourceDependency::Simple("../a1.md".to_string()),
5535 true,
5536 );
5537 manifest.add_dependency(
5538 "local2".to_string(),
5539 ResourceDependency::Simple("../a2.md".to_string()),
5540 true,
5541 );
5542
5543 // Create dummy files to allow transitive dependency extraction
5544 let parent = temp_dir.path().parent().unwrap();
5545 std::fs::write(parent.join("a1.md"), "# Agent 1").unwrap();
5546 std::fs::write(parent.join("a2.md"), "# Agent 2").unwrap();
5547
5548 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
5549 let mut resolver = DependencyResolver::with_cache(manifest.clone(), cache);
5550
5551 // Initial resolve
5552 let initial_lockfile = resolver.resolve().await.unwrap();
5553 assert_eq!(initial_lockfile.agents.len(), 2);
5554
5555 // Passing None updates all dependencies
5556 let cache2 = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
5557 let mut resolver2 = DependencyResolver::with_cache(manifest, cache2);
5558 let updated_lockfile = resolver2.update(&initial_lockfile, None).await.unwrap();
5559
5560 // All dependencies should be present
5561 assert_eq!(updated_lockfile.agents.len(), 2);
5562 }
5563
5564 // NOTE: Comprehensive integration tests for update() with transitive dependencies
5565 // are in tests/integration_incremental_add.rs. These provide end-to-end testing
5566 // of the incremental `agpm add dep` scenario which exercises the update() method.
5567
5568 #[tokio::test]
5569 async fn test_resolve_hooks_resource_type() {
5570 let temp_dir = TempDir::new().unwrap();
5571 let mut manifest = Manifest::new();
5572 manifest.manifest_dir = Some(temp_dir.path().to_path_buf());
5573
5574 // Add hook dependencies
5575 manifest.hooks.insert(
5576 "pre-commit".to_string(),
5577 ResourceDependency::Simple("../hooks/pre-commit.json".to_string()),
5578 );
5579 manifest.hooks.insert(
5580 "post-commit".to_string(),
5581 ResourceDependency::Simple("../hooks/post-commit.json".to_string()),
5582 );
5583
5584 // Create dummy hook files to allow transitive dependency extraction
5585 let hooks_dir = temp_dir.path().parent().unwrap().join("hooks");
5586 std::fs::create_dir_all(&hooks_dir).unwrap();
5587 std::fs::write(hooks_dir.join("pre-commit.json"), "{}").unwrap();
5588 std::fs::write(hooks_dir.join("post-commit.json"), "{}").unwrap();
5589
5590 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
5591 let mut resolver = DependencyResolver::with_cache(manifest, cache);
5592
5593 let lockfile = resolver.resolve().await.unwrap();
5594 assert_eq!(lockfile.hooks.len(), 2);
5595
5596 // Check that hooks point to the config file where they're configured
5597 for hook in &lockfile.hooks {
5598 assert_eq!(
5599 hook.installed_at, ".claude/settings.local.json",
5600 "Hooks should reference the config file where they're configured"
5601 );
5602 }
5603 }
5604
5605 #[tokio::test]
5606 async fn test_resolve_scripts_resource_type() {
5607 let temp_dir = TempDir::new().unwrap();
5608 let mut manifest = Manifest::new();
5609 manifest.manifest_dir = Some(temp_dir.path().to_path_buf());
5610
5611 // Add script dependencies
5612 manifest.scripts.insert(
5613 "build".to_string(),
5614 ResourceDependency::Simple("../scripts/build.sh".to_string()),
5615 );
5616 manifest.scripts.insert(
5617 "test".to_string(),
5618 ResourceDependency::Simple("../scripts/test.py".to_string()),
5619 );
5620
5621 // Create dummy script files to allow transitive dependency extraction
5622 let scripts_dir = temp_dir.path().parent().unwrap().join("scripts");
5623 std::fs::create_dir_all(&scripts_dir).unwrap();
5624 std::fs::write(scripts_dir.join("build.sh"), "#!/bin/bash").unwrap();
5625 std::fs::write(scripts_dir.join("test.py"), "#!/usr/bin/env python3").unwrap();
5626
5627 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
5628 let mut resolver = DependencyResolver::with_cache(manifest, cache);
5629
5630 let lockfile = resolver.resolve().await.unwrap();
5631 assert_eq!(lockfile.scripts.len(), 2);
5632
5633 // Check that scripts maintain their extensions
5634 let build_script = lockfile.scripts.iter().find(|s| s.name == "build").unwrap();
5635 assert!(build_script.installed_at.ends_with("build.sh"));
5636
5637 let test_script = lockfile.scripts.iter().find(|s| s.name == "test").unwrap();
5638 assert!(test_script.installed_at.ends_with("test.py"));
5639 }
5640
5641 #[tokio::test]
5642 async fn test_resolve_mcp_servers_resource_type() {
5643 let temp_dir = TempDir::new().unwrap();
5644 let mut manifest = Manifest::new();
5645 manifest.manifest_dir = Some(temp_dir.path().to_path_buf());
5646
5647 // Add MCP server dependencies
5648 manifest.mcp_servers.insert(
5649 "filesystem".to_string(),
5650 ResourceDependency::Simple("../mcp/filesystem.json".to_string()),
5651 );
5652 manifest.mcp_servers.insert(
5653 "database".to_string(),
5654 ResourceDependency::Simple("../mcp/database.json".to_string()),
5655 );
5656
5657 // Create dummy MCP server files to allow transitive dependency extraction
5658 let mcp_dir = temp_dir.path().parent().unwrap().join("mcp");
5659 std::fs::create_dir_all(&mcp_dir).unwrap();
5660 std::fs::write(mcp_dir.join("filesystem.json"), "{}").unwrap();
5661 std::fs::write(mcp_dir.join("database.json"), "{}").unwrap();
5662
5663 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
5664 let mut resolver = DependencyResolver::with_cache(manifest, cache);
5665
5666 let lockfile = resolver.resolve().await.unwrap();
5667 assert_eq!(lockfile.mcp_servers.len(), 2);
5668
5669 // Check that MCP servers point to the config file where they're configured
5670 for server in &lockfile.mcp_servers {
5671 assert_eq!(
5672 server.installed_at, ".mcp.json",
5673 "MCP servers should reference the config file where they're configured"
5674 );
5675 }
5676 }
5677
5678 #[tokio::test]
5679 async fn test_resolve_commands_resource_type() {
5680 let temp_dir = TempDir::new().unwrap();
5681 let mut manifest = Manifest::new();
5682 manifest.manifest_dir = Some(temp_dir.path().to_path_buf());
5683
5684 // Add command dependencies
5685 manifest.commands.insert(
5686 "deploy".to_string(),
5687 ResourceDependency::Simple("../commands/deploy.md".to_string()),
5688 );
5689 manifest.commands.insert(
5690 "lint".to_string(),
5691 ResourceDependency::Simple("../commands/lint.md".to_string()),
5692 );
5693
5694 // Create dummy command files to allow transitive dependency extraction
5695 let commands_dir = temp_dir.path().parent().unwrap().join("commands");
5696 std::fs::create_dir_all(&commands_dir).unwrap();
5697 std::fs::write(commands_dir.join("deploy.md"), "# Deploy").unwrap();
5698 std::fs::write(commands_dir.join("lint.md"), "# Lint").unwrap();
5699
5700 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
5701 let mut resolver = DependencyResolver::with_cache(manifest, cache);
5702
5703 let lockfile = resolver.resolve().await.unwrap();
5704 assert_eq!(lockfile.commands.len(), 2);
5705
5706 // Check that commands are installed to the correct location
5707 for command in &lockfile.commands {
5708 // Normalize path separators for cross-platform testing
5709 let normalized_path = normalize_path_for_storage(&command.installed_at);
5710 assert!(normalized_path.contains(".claude/commands/"));
5711 assert!(command.installed_at.ends_with(".md"));
5712 }
5713 }
5714
5715 #[tokio::test]
5716 async fn test_checkout_version_with_constraint() {
5717 let temp_dir = TempDir::new().unwrap();
5718
5719 // Create a git repo with multiple version tags
5720 let source_dir = temp_dir.path().join("test-source");
5721 std::fs::create_dir_all(&source_dir).unwrap();
5722 std::process::Command::new("git")
5723 .args(["init"])
5724 .current_dir(&source_dir)
5725 .output()
5726 .expect("Failed to initialize git repository");
5727
5728 // Configure git
5729 std::process::Command::new("git")
5730 .args(["config", "user.email", "test@example.com"])
5731 .current_dir(&source_dir)
5732 .output()
5733 .unwrap();
5734 std::process::Command::new("git")
5735 .args(["config", "user.name", "Test User"])
5736 .current_dir(&source_dir)
5737 .output()
5738 .unwrap();
5739
5740 // Create file and make commits with version tags
5741 let test_file = source_dir.join("test.txt");
5742
5743 // v1.0.0
5744 std::fs::write(&test_file, "v1.0.0").unwrap();
5745 std::process::Command::new("git")
5746 .args(["add", "."])
5747 .current_dir(&source_dir)
5748 .output()
5749 .unwrap();
5750 std::process::Command::new("git")
5751 .args(["commit", "-m", "v1.0.0"])
5752 .current_dir(&source_dir)
5753 .output()
5754 .unwrap();
5755 std::process::Command::new("git")
5756 .args(["tag", "v1.0.0"])
5757 .current_dir(&source_dir)
5758 .output()
5759 .unwrap();
5760
5761 // v1.1.0
5762 std::fs::write(&test_file, "v1.1.0").unwrap();
5763 std::process::Command::new("git")
5764 .args(["add", "."])
5765 .current_dir(&source_dir)
5766 .output()
5767 .unwrap();
5768 std::process::Command::new("git")
5769 .args(["commit", "-m", "v1.1.0"])
5770 .current_dir(&source_dir)
5771 .output()
5772 .unwrap();
5773 std::process::Command::new("git")
5774 .args(["tag", "v1.1.0"])
5775 .current_dir(&source_dir)
5776 .output()
5777 .unwrap();
5778
5779 // v1.2.0
5780 std::fs::write(&test_file, "v1.2.0").unwrap();
5781 std::process::Command::new("git")
5782 .args(["add", "."])
5783 .current_dir(&source_dir)
5784 .output()
5785 .unwrap();
5786 std::process::Command::new("git")
5787 .args(["commit", "-m", "v1.2.0"])
5788 .current_dir(&source_dir)
5789 .output()
5790 .unwrap();
5791 std::process::Command::new("git")
5792 .args(["tag", "v1.2.0"])
5793 .current_dir(&source_dir)
5794 .output()
5795 .unwrap();
5796
5797 // v2.0.0
5798 std::fs::write(&test_file, "v2.0.0").unwrap();
5799 std::process::Command::new("git")
5800 .args(["add", "."])
5801 .current_dir(&source_dir)
5802 .output()
5803 .unwrap();
5804 std::process::Command::new("git")
5805 .args(["commit", "-m", "v2.0.0"])
5806 .current_dir(&source_dir)
5807 .output()
5808 .unwrap();
5809 std::process::Command::new("git")
5810 .args(["tag", "v2.0.0"])
5811 .current_dir(&source_dir)
5812 .output()
5813 .unwrap();
5814
5815 let mut manifest = Manifest::new();
5816 // Use file:// URL to ensure it's treated as a Git source, not a local path
5817 let source_url = format!("file://{}", source_dir.display());
5818 manifest.add_source("test".to_string(), source_url);
5819
5820 // Test version constraint resolution (^1.0.0 should resolve to 1.2.0)
5821 manifest.add_dependency(
5822 "constrained-dep".to_string(),
5823 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
5824 source: Some("test".to_string()),
5825 path: "test.txt".to_string(),
5826 version: Some("^1.0.0".to_string()), // Constraint: compatible with 1.x.x
5827 branch: None,
5828 rev: None,
5829 command: None,
5830 args: None,
5831 target: None,
5832 filename: None,
5833 dependencies: None,
5834 tool: Some("claude-code".to_string()),
5835 flatten: None,
5836 })),
5837 true,
5838 );
5839
5840 let cache_dir = temp_dir.path().join("cache");
5841 let cache = Cache::with_dir(cache_dir).unwrap();
5842 let mut resolver = DependencyResolver::with_cache(manifest, cache);
5843
5844 let lockfile = resolver.resolve().await.unwrap();
5845 assert_eq!(lockfile.agents.len(), 1);
5846
5847 let agent = &lockfile.agents[0];
5848 // Should resolve to highest 1.x version (1.2.0), not 2.0.0
5849 assert_eq!(agent.version.as_ref().unwrap(), "v1.2.0");
5850 }
5851
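    // A minimal illustration of why the `^1.0.0` constraint in the test above
    // resolves to v1.2.0 rather than v2.0.0: a caret requirement accepts any
    // compatible 1.x.x release but excludes the next major version. The `semver`
    // crate is used here purely as a sketch of those semantics and is assumed
    // (not guaranteed) to match the resolver's own constraint handling.
    #[test]
    fn test_caret_constraint_semantics_sketch() {
        use semver::{Version, VersionReq};

        let req = VersionReq::parse("^1.0.0").unwrap();
        // 1.2.0 is the highest compatible release among the tags created above.
        assert!(req.matches(&Version::parse("1.2.0").unwrap()));
        // 2.0.0 is a major-version bump and falls outside ^1.0.0.
        assert!(!req.matches(&Version::parse("2.0.0").unwrap()));
    }
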
5852 #[tokio::test]
5853 async fn test_verify_absolute_path_error() {
5854 let mut manifest = Manifest::new();
5855
5856 // Add dependency with non-existent absolute path
5857 // Use platform-specific absolute path
5858 let nonexistent_path = if cfg!(windows) {
5859 "C:\\nonexistent\\path\\agent.md"
5860 } else {
5861 "/nonexistent/path/agent.md"
5862 };
5863
5864 manifest.add_dependency(
5865 "missing-agent".to_string(),
5866 ResourceDependency::Simple(nonexistent_path.to_string()),
5867 true,
5868 );
5869
5870 let temp_dir = TempDir::new().unwrap();
5871 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
5872 let mut resolver = DependencyResolver::with_cache(manifest, cache);
5873
5874 let result = resolver.verify();
5875 assert!(result.is_err());
5876 assert!(result.unwrap_err().to_string().contains("not found"));
5877 }
5878
5879 #[tokio::test]
5880 async fn test_resolve_pattern_dependency_error() {
5881 let mut manifest = Manifest::new();
5882
5883 // Add a pattern dependency referencing an undefined source (resolve_pattern_dependency should error)
5884 manifest.add_dependency(
5885 "pattern-dep".to_string(),
5886 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
5887 source: Some("nonexistent".to_string()),
5888 path: "agents/*.md".to_string(), // Pattern path
5889 version: None,
5890 branch: None,
5891 rev: None,
5892 command: None,
5893 args: None,
5894 target: None,
5895 filename: None,
5896 dependencies: None,
5897 tool: Some("claude-code".to_string()),
5898 flatten: None,
5899 })),
5900 true,
5901 );
5902
5903 let temp_dir = TempDir::new().unwrap();
5904 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
5905 let mut resolver = DependencyResolver::with_cache(manifest, cache);
5906
5907 let result = resolver.resolve().await;
5908 assert!(result.is_err());
5909 }
5910
5911 #[tokio::test]
5912 async fn test_checkout_version_with_branch() {
5913 let temp_dir = TempDir::new().unwrap();
5914
5915 // Create a git repo with a branch
5916 let source_dir = temp_dir.path().join("test-source");
5917 std::fs::create_dir_all(&source_dir).unwrap();
5918 std::process::Command::new("git")
5919 .args(["init"])
5920 .current_dir(&source_dir)
5921 .output()
5922 .expect("Failed to initialize git repository");
5923
5924 // Configure git
5925 std::process::Command::new("git")
5926 .args(["config", "user.email", "test@example.com"])
5927 .current_dir(&source_dir)
5928 .output()
5929 .unwrap();
5930 std::process::Command::new("git")
5931 .args(["config", "user.name", "Test User"])
5932 .current_dir(&source_dir)
5933 .output()
5934 .unwrap();
5935
5936 // Create initial commit on main
5937 let test_file = source_dir.join("test.txt");
5938 std::fs::write(&test_file, "main").unwrap();
5939 std::process::Command::new("git")
5940 .args(["add", "."])
5941 .current_dir(&source_dir)
5942 .output()
5943 .unwrap();
5944 std::process::Command::new("git")
5945 .args(["commit", "-m", "Initial"])
5946 .current_dir(&source_dir)
5947 .output()
5948 .unwrap();
5949
5950 // Create and switch to develop branch
5951 std::process::Command::new("git")
5952 .args(["checkout", "-b", "develop"])
5953 .current_dir(&source_dir)
5954 .output()
5955 .unwrap();
5956
5957 // Make a commit on develop
5958 std::fs::write(&test_file, "develop").unwrap();
5959 std::process::Command::new("git")
5960 .args(["add", "."])
5961 .current_dir(&source_dir)
5962 .output()
5963 .unwrap();
5964 std::process::Command::new("git")
5965 .args(["commit", "-m", "Develop commit"])
5966 .current_dir(&source_dir)
5967 .output()
5968 .unwrap();
5969
5970 let mut manifest = Manifest::new();
5971 // Use file:// URL to ensure it's treated as a Git source, not a local path
5972 let source_url = format!("file://{}", source_dir.display());
5973 manifest.add_source("test".to_string(), source_url);
5974
5975 // Test branch checkout
5976 manifest.add_dependency(
5977 "branch-dep".to_string(),
5978 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
5979 source: Some("test".to_string()),
5980 path: "test.txt".to_string(),
5981 version: Some("develop".to_string()), // Branch name
5982 branch: None,
5983 rev: None,
5984 command: None,
5985 args: None,
5986 target: None,
5987 filename: None,
5988 dependencies: None,
5989 tool: Some("claude-code".to_string()),
5990 flatten: None,
5991 })),
5992 true,
5993 );
5994
5995 let cache_dir = temp_dir.path().join("cache");
5996 let cache = Cache::with_dir(cache_dir).unwrap();
5997 let mut resolver = DependencyResolver::with_cache(manifest, cache);
5998
5999 let lockfile = resolver.resolve().await.unwrap();
6000 assert_eq!(lockfile.agents.len(), 1);
6001
6002 // Should have resolved to develop branch
6003 let agent = &lockfile.agents[0];
6004 assert!(agent.resolved_commit.is_some());
6005 }
6006
6007 #[tokio::test]
6008 async fn test_checkout_version_with_commit_hash() {
6009 let temp_dir = TempDir::new().unwrap();
6010
6011 // Create a git repo
6012 let source_dir = temp_dir.path().join("test-source");
6013 std::fs::create_dir_all(&source_dir).unwrap();
6014 std::process::Command::new("git")
6015 .args(["init"])
6016 .current_dir(&source_dir)
6017 .output()
6018 .expect("Failed to initialize git repository");
6019
6020 // Configure git
6021 std::process::Command::new("git")
6022 .args(["config", "user.email", "test@example.com"])
6023 .current_dir(&source_dir)
6024 .output()
6025 .unwrap();
6026 std::process::Command::new("git")
6027 .args(["config", "user.name", "Test User"])
6028 .current_dir(&source_dir)
6029 .output()
6030 .unwrap();
6031
6032 // Create a commit
6033 let test_file = source_dir.join("test.txt");
6034 std::fs::write(&test_file, "content").unwrap();
6035 std::process::Command::new("git")
6036 .args(["add", "."])
6037 .current_dir(&source_dir)
6038 .output()
6039 .unwrap();
6040 std::process::Command::new("git")
6041 .args(["commit", "-m", "Test commit"])
6042 .current_dir(&source_dir)
6043 .output()
6044 .unwrap();
6045
6046 // Get the commit hash
6047 let output = std::process::Command::new("git")
6048 .args(["rev-parse", "HEAD"])
6049 .current_dir(&source_dir)
6050 .output()
6051 .unwrap();
6052 let commit_hash = String::from_utf8_lossy(&output.stdout).trim().to_string();
6053
6054 let mut manifest = Manifest::new();
6055 // Use file:// URL to ensure it's treated as a Git source, not a local path
6056 let source_url = format!("file://{}", source_dir.display());
6057 manifest.add_source("test".to_string(), source_url);
6058
6059 // Test commit hash checkout (use first 7 chars for short hash)
6060 manifest.add_dependency(
6061 "commit-dep".to_string(),
6062 ResourceDependency::Detailed(Box::new(crate::manifest::DetailedDependency {
6063 source: Some("test".to_string()),
6064 path: "test.txt".to_string(),
6065 version: Some(commit_hash[..7].to_string()), // Short commit hash
6066 branch: None,
6067 rev: None,
6068 command: None,
6069 args: None,
6070 target: None,
6071 filename: None,
6072 dependencies: None,
6073 tool: Some("claude-code".to_string()),
6074 flatten: None,
6075 })),
6076 true,
6077 );
6078
6079 let cache_dir = temp_dir.path().join("cache");
6080 let cache = Cache::with_dir(cache_dir).unwrap();
6081 let mut resolver = DependencyResolver::with_cache(manifest, cache);
6082
6083 let lockfile = resolver.resolve().await.unwrap();
6084 assert_eq!(lockfile.agents.len(), 1);
6085
6086 let agent = &lockfile.agents[0];
6087 assert!(agent.resolved_commit.is_some());
6088 // The resolved commit should start with our short hash
6089 assert!(agent.resolved_commit.as_ref().unwrap().starts_with(&commit_hash[..7]));
6090 }
6091
6092 #[tokio::test]
6093 async fn test_mixed_resource_types() {
6094 let temp_dir = TempDir::new().unwrap();
6095 let mut manifest = Manifest::new();
6096 manifest.manifest_dir = Some(temp_dir.path().to_path_buf());
6097
6098 // Add various resource types
6099 manifest.add_dependency(
6100 "agent1".to_string(),
6101 ResourceDependency::Simple("../agents/a1.md".to_string()),
6102 true,
6103 );
6104
6105 manifest.scripts.insert(
6106 "build".to_string(),
6107 ResourceDependency::Simple("../scripts/build.sh".to_string()),
6108 );
6109
6110 manifest.hooks.insert(
6111 "pre-commit".to_string(),
6112 ResourceDependency::Simple("../hooks/pre-commit.json".to_string()),
6113 );
6114
6115 manifest.commands.insert(
6116 "deploy".to_string(),
6117 ResourceDependency::Simple("../commands/deploy.md".to_string()),
6118 );
6119
6120 manifest.mcp_servers.insert(
6121 "filesystem".to_string(),
6122 ResourceDependency::Simple("../mcp/filesystem.json".to_string()),
6123 );
6124
6125 // Create dummy files for all resource types to allow transitive dependency extraction
6126 let parent = temp_dir.path().parent().unwrap();
6127 std::fs::create_dir_all(parent.join("agents")).unwrap();
6128 std::fs::create_dir_all(parent.join("scripts")).unwrap();
6129 std::fs::create_dir_all(parent.join("hooks")).unwrap();
6130 std::fs::create_dir_all(parent.join("commands")).unwrap();
6131 std::fs::create_dir_all(parent.join("mcp")).unwrap();
6132 std::fs::write(parent.join("agents/a1.md"), "# Agent").unwrap();
6133 std::fs::write(parent.join("scripts/build.sh"), "#!/bin/bash").unwrap();
6134 std::fs::write(parent.join("hooks/pre-commit.json"), "{}").unwrap();
6135 std::fs::write(parent.join("commands/deploy.md"), "# Deploy").unwrap();
6136 std::fs::write(parent.join("mcp/filesystem.json"), "{}").unwrap();
6137
6138 let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
6139 let mut resolver = DependencyResolver::with_cache(manifest, cache);
6140
6141 let lockfile = resolver.resolve().await.unwrap();
6142
6143 // Check all resource types are resolved
6144 assert_eq!(lockfile.agents.len(), 1);
6145 assert_eq!(lockfile.scripts.len(), 1);
6146 assert_eq!(lockfile.hooks.len(), 1);
6147 assert_eq!(lockfile.commands.len(), 1);
6148 assert_eq!(lockfile.mcp_servers.len(), 1);
6149 }
6150
    #[tokio::test]
    async fn test_resolve_version_conflict_semver_preference() {
        let manifest = Manifest::new();
        let temp_dir = TempDir::new().unwrap();
        let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
        let resolver = DependencyResolver::with_cache(manifest, cache);

        // Test: semver version preferred over git branch
        let existing_semver = ResourceDependency::Detailed(Box::new(DetailedDependency {
            source: Some("test".to_string()),
            path: "agents/test.md".to_string(),
            version: Some("v1.0.0".to_string()),
            branch: None,
            rev: None,
            command: None,
            args: None,
            target: None,
            filename: None,
            dependencies: None,
            tool: Some("claude-code".to_string()),
            flatten: None,
        }));

        let new_branch = ResourceDependency::Detailed(Box::new(DetailedDependency {
            source: Some("test".to_string()),
            path: "agents/test.md".to_string(),
            version: None,
            branch: Some("main".to_string()),
            rev: None,
            command: None,
            args: None,
            target: None,
            filename: None,
            dependencies: None,
            tool: Some("claude-code".to_string()),
            flatten: None,
        }));

        let result = resolver
            .resolve_version_conflict("test-agent", &existing_semver, &new_branch, "app1")
            .await;
        assert!(result.is_ok(), "Should succeed with semver preference");
        let resolved = result.unwrap();
        assert_eq!(
            resolved.get_version(),
            Some("v1.0.0"),
            "Should prefer semver version over branch"
        );

        // Test reverse: git branch vs semver version
        let result2 = resolver
            .resolve_version_conflict("test-agent", &new_branch, &existing_semver, "app2")
            .await;
        assert!(result2.is_ok(), "Should succeed with semver preference");
        let resolved2 = result2.unwrap();
        assert_eq!(
            resolved2.get_version(),
            Some("v1.0.0"),
            "Should prefer semver version over branch (reversed order)"
        );
    }

    #[tokio::test]
    async fn test_resolve_version_conflict_semver_comparison() {
        let manifest = Manifest::new();
        let temp_dir = TempDir::new().unwrap();
        let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
        let resolver = DependencyResolver::with_cache(manifest, cache);

        // Test: higher semver version wins
        let existing_v1 = ResourceDependency::Detailed(Box::new(DetailedDependency {
            source: Some("test".to_string()),
            path: "agents/test.md".to_string(),
            version: Some("v1.5.0".to_string()),
            branch: None,
            rev: None,
            command: None,
            args: None,
            target: None,
            filename: None,
            dependencies: None,
            tool: Some("claude-code".to_string()),
            flatten: None,
        }));

        let new_v2 = ResourceDependency::Detailed(Box::new(DetailedDependency {
            source: Some("test".to_string()),
            path: "agents/test.md".to_string(),
            version: Some("v2.0.0".to_string()),
            branch: None,
            rev: None,
            command: None,
            args: None,
            target: None,
            filename: None,
            dependencies: None,
            tool: Some("claude-code".to_string()),
            flatten: None,
        }));

        let result =
            resolver.resolve_version_conflict("test-agent", &existing_v1, &new_v2, "app1").await;
        assert!(result.is_ok(), "Should succeed with higher version");
        let resolved = result.unwrap();
        assert_eq!(resolved.get_version(), Some("v2.0.0"), "Should use higher semver version");

        // Test reverse order
        let result2 =
            resolver.resolve_version_conflict("test-agent", &new_v2, &existing_v1, "app2").await;
        assert!(result2.is_ok(), "Should succeed with higher version");
        let resolved2 = result2.unwrap();
        assert_eq!(
            resolved2.get_version(),
            Some("v2.0.0"),
            "Should use higher semver version (reversed order)"
        );
    }

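    // Illustrative only: a minimal "higher version wins" key over plain `vMAJOR.MINOR.PATCH`
    // tags, mirroring what the two semver tests above assert. The resolver's real comparison
    // logic (including ranges and pre-release handling, if any) lives elsewhere in this
    // module and is not reproduced here; this sketch handles exact tags only.
    // e.g. naive_semver_key("v1.5.0") < naive_semver_key("v2.0.0") holds by tuple ordering.
    #[allow(dead_code)]
    fn naive_semver_key(tag: &str) -> Option<(u64, u64, u64)> {
        let mut parts = tag.trim_start_matches('v').splitn(3, '.');
        Some((
            parts.next()?.parse().ok()?,
            parts.next()?.parse().ok()?,
            parts.next()?.parse().ok()?,
        ))
    }
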
    #[tokio::test]
    async fn test_resolve_version_conflict_git_refs() {
        let manifest = Manifest::new();
        let temp_dir = TempDir::new().unwrap();
        let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
        let resolver = DependencyResolver::with_cache(manifest, cache);

        // Test: git refs use alphabetical ordering
        let existing_main = ResourceDependency::Detailed(Box::new(DetailedDependency {
            source: Some("test".to_string()),
            path: "agents/test.md".to_string(),
            version: None,
            branch: Some("main".to_string()),
            rev: None,
            command: None,
            args: None,
            target: None,
            filename: None,
            dependencies: None,
            tool: Some("claude-code".to_string()),
            flatten: None,
        }));

        let new_develop = ResourceDependency::Detailed(Box::new(DetailedDependency {
            source: Some("test".to_string()),
            path: "agents/test.md".to_string(),
            version: None,
            branch: Some("develop".to_string()),
            rev: None,
            command: None,
            args: None,
            target: None,
            filename: None,
            dependencies: None,
            tool: Some("claude-code".to_string()),
            flatten: None,
        }));

        let result = resolver
            .resolve_version_conflict("test-agent", &existing_main, &new_develop, "app1")
            .await;
        assert!(result.is_ok(), "Should succeed with alphabetical ordering");
        let resolved = result.unwrap();
        assert_eq!(
            resolved.get_version(),
            Some("develop"),
            "Should use alphabetically first git ref"
        );
    }

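    // Note on the test above: the alphabetical ordering reads as a deterministic tie-break
    // between two branches when neither ref carries semver information, not as a claim
    // that `develop` is somehow newer than `main`.
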
    #[tokio::test]
    async fn test_resolve_version_conflict_head_vs_specific() {
        let manifest = Manifest::new();
        let temp_dir = TempDir::new().unwrap();
        let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
        let resolver = DependencyResolver::with_cache(manifest, cache);

        // Test: specific version preferred over HEAD (None)
        let existing_head = ResourceDependency::Simple("agents/test.md".to_string());

        let new_specific = ResourceDependency::Detailed(Box::new(DetailedDependency {
            source: Some("test".to_string()),
            path: "agents/test.md".to_string(),
            version: Some("v1.0.0".to_string()),
            branch: None,
            rev: None,
            command: None,
            args: None,
            target: None,
            filename: None,
            dependencies: None,
            tool: Some("claude-code".to_string()),
            flatten: None,
        }));

        let result = resolver
            .resolve_version_conflict("test-agent", &existing_head, &new_specific, "app1")
            .await;
        assert!(result.is_ok(), "Should succeed with specific version");
        let resolved = result.unwrap();
        assert_eq!(
            resolved.get_version(),
            Some("v1.0.0"),
            "Should prefer specific version over HEAD"
        );
    }

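    // Name-collision example for the test below: if the parent components of cross-directory
    // paths were stripped, `../shared/utils.md` and `../other/utils.md` would both collapse
    // to `utils`; keeping the components is what keeps the generated names distinct.
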
    #[test]
    fn test_generate_dependency_name_manifest_relative() {
        let manifest = Manifest::new();
        let temp_dir = TempDir::new().unwrap();
        let cache = Cache::with_dir(temp_dir.path().to_path_buf()).unwrap();
        let resolver = DependencyResolver::with_cache(manifest, cache);

        // Regular relative paths - strip resource type directory
        assert_eq!(resolver.generate_dependency_name("agents/helper.md"), "helper");
        assert_eq!(resolver.generate_dependency_name("snippets/utils.md"), "utils");

        // Cross-directory paths with ../ - keep all components to avoid collisions
        assert_eq!(resolver.generate_dependency_name("../shared/utils.md"), "../shared/utils");
        assert_eq!(resolver.generate_dependency_name("../other/utils.md"), "../other/utils");

        // Ensure different cross-directory paths get different names
        let name1 = resolver.generate_dependency_name("../shared/foo.md");
        let name2 = resolver.generate_dependency_name("../other/foo.md");
        assert_ne!(name1, name2, "Different parent directories should produce different names");
    }
}