foundry_compilers/compile/project.rs

//! Manages compiling of a `Project`
//!
//! The compilation of a project is performed in several steps.
//!
//! First the project's dependency graph [`crate::Graph`] is constructed and all imported
//! dependencies are resolved. The graph holds all the relationships between the files and their
//! versions. From there the appropriate version sets are derived, determining which files need to
//! be compiled with which [`crate::compilers::solc::Solc`] versions.
//!
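//! A minimal sketch of this resolution step, using the same calls this module makes internally
//! (error handling elided, and whether these items are re-exported at your crate root is an
//! assumption):
//!
//! ```ignore
//! let sources = project.paths.read_input_files()?;
//! // Build the dependency graph for all sources and their imports.
//! let graph = Graph::resolve_sources(&project.paths, sources)?;
//! // Partition the sources into sets that a single compiler version can handle.
//! let ResolvedSources { sources, primary_profiles, edges } =
//!     graph.into_sources_by_version(&project)?;
//! ```
//!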
//! At this point we check if we need to compile a source file or whether we can reuse an _existing_
//! `Artifact`. We don't need to compile if:
//!     - caching is enabled
//!     - the file is **not** dirty
//!     - the artifact for that file exists
//!
//! This concludes the preprocessing, and we now have either
//!    - only `Source` files that need to be compiled
//!    - only cached `Artifacts`, compilation can be skipped. This is considered an unchanged,
//!      cached project
//!    - a mix of both `Source` and `Artifacts`: only the `Source` files need to be compiled, the
//!      `Artifacts` can be reused.
//!
//! The final step is invoking `Solc` via the standard JSON format.
//!
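//! For orientation, the standard JSON input handed to `solc --standard-json` has roughly this
//! shape (a hand-written sketch of solc's input format, not the exact output of this crate's
//! input builders):
//!
//! ```ignore
//! let input = serde_json::json!({
//!     "language": "Solidity",
//!     "sources": {
//!         "src/Contract.sol": { "content": "pragma solidity ^0.8.10; contract C {}" }
//!     },
//!     "settings": {
//!         // Only request what is needed; the output selection is pruned further for clean files.
//!         "outputSelection": { "*": { "*": ["abi", "evm.bytecode"] } }
//!     }
//! });
//! ```
//!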
//! ### Notes on [Import Path Resolution](https://docs.soliditylang.org/en/develop/path-resolution.html#path-resolution)
//!
//! In order to be able to support reproducible builds on all platforms, the Solidity compiler has
//! to abstract away the details of the filesystem where source files are stored. Paths used in
//! imports must work the same way everywhere while the command-line interface must be able to work
//! with platform-specific paths to provide good user experience. This section aims to explain in
//! detail how Solidity reconciles these requirements.
//!
//! The compiler maintains an internal database (virtual filesystem or VFS for short) where each
//! source unit is assigned a unique source unit name which is an opaque and unstructured
//! identifier. When you use the import statement, you specify an import path that references a
//! source unit name. If the compiler does not find any source unit name matching the import path in
//! the VFS, it invokes the callback, which is responsible for obtaining the source code to be
//! placed under that name.
//!
//! This becomes relevant when dealing with resolved imports.
//!
//! #### Relative Imports
//!
//! ```solidity
//! import "./math/math.sol";
//! import "contracts/tokens/token.sol";
//! ```
//! In the above, assuming the importing file is `contracts/contract.sol`, `./math/math.sol` and
//! `contracts/tokens/token.sol` are import paths, while the source unit names they translate to are
//! `contracts/math/math.sol` and `contracts/tokens/token.sol` respectively.
//!
//! #### Direct Imports
//!
//! An import that does not start with `./` or `../` is a direct import.
//!
//! ```solidity
//! import "/project/lib/util.sol";         // source unit name: /project/lib/util.sol
//! import "lib/util.sol";                  // source unit name: lib/util.sol
//! import "@openzeppelin/address.sol";     // source unit name: @openzeppelin/address.sol
//! import "https://example.com/token.sol"; // source unit name: <https://example.com/token.sol>
//! ```
//!
//! After applying any import remappings the import path simply becomes the source unit name.
//!
//! ##### Import Remapping
//!
//! ```solidity
//! import "github.com/ethereum/dapp-bin/library/math.sol"; // source unit name: dapp-bin/library/math.sol
//! ```
//!
//! If compiled with `solc github.com/ethereum/dapp-bin/=dapp-bin/` the compiler will look for the
//! file in the VFS under `dapp-bin/library/math.sol`. If the file is not available there, the
//! source unit name will be passed to the Host Filesystem Loader, which will then look in
//! `/project/dapp-bin/library/math.sol`.
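//!
//! A minimal sketch of what such a prefix remapping does to an import path (a hypothetical helper,
//! not this crate's actual remapping implementation):
//!
//! ```
//! /// Rewrites `import_path` into a source unit name if it starts with the remapped prefix.
//! fn apply_remapping(import_path: &str, from: &str, to: &str) -> String {
//!     match import_path.strip_prefix(from) {
//!         Some(rest) => format!("{to}{rest}"),
//!         None => import_path.to_string(),
//!     }
//! }
//!
//! let name = apply_remapping(
//!     "github.com/ethereum/dapp-bin/library/math.sol",
//!     "github.com/ethereum/dapp-bin/",
//!     "dapp-bin/",
//! );
//! assert_eq!(name, "dapp-bin/library/math.sol");
//! ```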
//!
//!
//! ### Caching and Change detection
//!
//! If caching is enabled in the [Project] a cache file will be created upon a successful solc
//! build. The [cache file](crate::cache::CompilerCache) stores metadata for all the files that were
//! provided to solc.
//! For every file the cache file contains a dedicated [cache entry](crate::cache::CacheEntry),
//! which represents the state of the file. A Solidity file can contain several contracts; for every
//! contract a separate [artifact](crate::Artifact) is emitted. Therefore the entry also tracks all
//! artifacts emitted by a file. A Solidity file can also be compiled with several solc versions.
//!
//! For example in `A(<=0.8.10) imports C(>0.4.0)` and
//! `B(0.8.11) imports C(>0.4.0)`, both `A` and `B` import `C` but there's no solc version that's
//! compatible with both `A` and `B`, in which case two sets are compiled: [`A`, `C`] and [`B`, `C`].
//! This is reflected in the cache entry which tracks the file's artifacts by version.
//!
//! The cache makes it possible to detect changes during recompilation, so that only the changed
//! (dirty) files need to be passed to solc. A file will be considered dirty if:
//!   - the file is new, i.e. not included in the existing cache
//!   - the file was modified since the last compiler run, detected by comparing content hashes
//!   - any of the imported files is dirty
//!   - the file's artifacts don't exist or were deleted
//!
//! Recompiling a project with cache enabled detects all files that meet these criteria and provides
//! solc with only these dirty files instead of the entire source set.
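//!
//! A simplified sketch of the content-hash part of this check (hypothetical types shown for
//! illustration; the real logic lives in [`crate::cache::CacheEntry`] and also accounts for
//! imports, artifacts and compiler settings):
//!
//! ```ignore
//! fn is_dirty(entry: Option<&CacheEntry>, current_content_hash: &str) -> bool {
//!     match entry {
//!         // Unknown to the cache: the file is new and must be compiled.
//!         None => true,
//!         // Known: recompile only if the source content changed since the last run.
//!         Some(entry) => entry.content_hash != current_content_hash,
//!     }
//! }
//! ```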

use crate::{
    artifact_output::Artifacts,
    buildinfo::RawBuildInfo,
    cache::ArtifactsCache,
    compilers::{Compiler, CompilerInput, CompilerOutput, Language},
    filter::SparseOutputFilter,
    output::{AggregatedCompilerOutput, Builds},
    report,
    resolver::{GraphEdges, ResolvedSources},
    ArtifactOutput, CompilerSettings, Graph, Project, ProjectCompileOutput, ProjectPathsConfig,
    Sources,
};
use foundry_compilers_core::error::Result;
use rayon::prelude::*;
use semver::Version;
use std::{
    collections::{HashMap, HashSet},
    fmt::Debug,
    path::PathBuf,
    time::Instant,
};

/// A set of different Solc installations with their version and the sources to be compiled
pub(crate) type VersionedSources<'a, L, S> = HashMap<L, Vec<(Version, Sources, (&'a str, &'a S))>>;

/// Invoked before the actual compiler invocation and can override the input.
///
/// Updates the list of identified cached mocks (if any) to be stored in cache and updates the
/// compiler input.
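///
/// # Examples
///
/// A minimal sketch of a pass-through preprocessor (marked `ignore` because the exact re-export
/// paths of these items are an assumption here):
///
/// ```ignore
/// use std::{collections::HashSet, path::PathBuf};
///
/// #[derive(Debug)]
/// struct NoopPreprocessor;
///
/// impl<C: Compiler> Preprocessor<C> for NoopPreprocessor {
///     fn preprocess(
///         &self,
///         _compiler: &C,
///         _input: &mut C::Input,
///         _paths: &ProjectPathsConfig<C::Language>,
///         _mocks: &mut HashSet<PathBuf>,
///     ) -> Result<()> {
///         // Leave the compiler input and the mock set untouched.
///         Ok(())
///     }
/// }
/// ```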
pub trait Preprocessor<C: Compiler>: Debug {
    fn preprocess(
        &self,
        compiler: &C,
        input: &mut C::Input,
        paths: &ProjectPathsConfig<C::Language>,
        mocks: &mut HashSet<PathBuf>,
    ) -> Result<()>;
}

#[derive(Debug)]
pub struct ProjectCompiler<
    'a,
    T: ArtifactOutput<CompilerContract = C::CompilerContract>,
    C: Compiler,
> {
    /// Contains the relationship of the source files and their imports
    edges: GraphEdges<C::Parser>,
    project: &'a Project<C, T>,
    /// A mapping from a source file path to the primary profile name selected for it.
    primary_profiles: HashMap<PathBuf, &'a str>,
    /// how to compile all the sources
    sources: CompilerSources<'a, C::Language, C::Settings>,
    /// Optional preprocessor
    preprocessor: Option<Box<dyn Preprocessor<C>>>,
}

impl<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
    ProjectCompiler<'a, T, C>
{
    /// Create a new `ProjectCompiler` to bootstrap the compilation process of the project's
    /// sources.
    pub fn new(project: &'a Project<C, T>) -> Result<Self> {
        Self::with_sources(project, project.paths.read_input_files()?)
    }

    /// Bootstraps the compilation process by resolving the dependency graph of all sources and the
    /// appropriate `Solc` -> `Sources` set as well as the compile mode to use (parallel,
    /// sequential)
    ///
    /// Multiple (`Solc` -> `Sources`) pairs can be compiled in parallel if the `Project` allows
    /// multiple `jobs`, see [`crate::Project::set_solc_jobs()`].
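    ///
    /// # Examples
    ///
    /// A hedged sketch of driving compilation from an explicit source set (marked `ignore` because
    /// the re-export path of `ProjectCompiler` is an assumption here):
    ///
    /// ```ignore
    /// use foundry_compilers::Project;
    ///
    /// let project = Project::builder().build(Default::default())?;
    /// // Read the project's input files explicitly instead of going through `project.compile()`.
    /// let sources = project.paths.read_input_files()?;
    /// let output = ProjectCompiler::with_sources(&project, sources)?.compile()?;
    /// ```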
    #[instrument(name = "ProjectCompiler::new", skip_all)]
    pub fn with_sources(project: &'a Project<C, T>, mut sources: Sources) -> Result<Self> {
        if let Some(filter) = &project.sparse_output {
            sources.retain(|f, _| filter.is_match(f))
        }
        let graph = Graph::resolve_sources(&project.paths, sources)?;
        let ResolvedSources { sources, primary_profiles, edges } =
            graph.into_sources_by_version(project)?;

        // If there are multiple different versions and we can use multiple jobs, we can compile
        // them in parallel.
        let jobs_cnt = || sources.values().map(|v| v.len()).sum::<usize>();
        let sources = CompilerSources {
            jobs: (project.solc_jobs > 1 && jobs_cnt() > 1).then_some(project.solc_jobs),
            sources,
        };

        Ok(Self { edges, primary_profiles, project, sources, preprocessor: None })
    }
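
    /// Sets a [`Preprocessor`] to run on each compiler input before the compiler is invoked.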
    pub fn with_preprocessor(self, preprocessor: impl Preprocessor<C> + 'static) -> Self {
        Self { preprocessor: Some(Box::new(preprocessor)), ..self }
    }

    /// Compiles all the sources of the `Project` in the appropriate mode
    ///
    /// If caching is enabled, the sources are filtered and only _dirty_ sources are recompiled.
    ///
    /// The output of the compile process can be a mix of reused artifacts and freshly compiled
    /// `Contract`s
    ///
    /// # Examples
    /// ```no_run
    /// use foundry_compilers::Project;
    ///
    /// let project = Project::builder().build(Default::default())?;
    /// let output = project.compile()?;
    /// # Ok::<(), Box<dyn std::error::Error>>(())
    /// ```
    #[instrument(name = "compile_project", skip_all)]
    pub fn compile(self) -> Result<ProjectCompileOutput<C, T>> {
        let slash_paths = self.project.slash_paths;

        // drive the compiler statemachine to completion
        let mut output = self.preprocess()?.compile()?.write_artifacts()?.write_cache()?;

        if slash_paths {
            // ensures we always use `/` paths
            output.slash_paths();
        }

        Ok(output)
    }

    /// Does basic preprocessing
    ///   - sets proper source unit names
    ///   - checks the cache
    #[instrument(skip_all)]
    fn preprocess(self) -> Result<PreprocessedState<'a, T, C>> {
        trace!("preprocessing");
        let Self { edges, project, mut sources, primary_profiles, preprocessor } = self;

        // convert paths on windows to ensure consistency with the `CompilerOutput` `solc` emits,
        // which is unix style `/`
        sources.slash_paths();

        let mut cache = ArtifactsCache::new(project, edges, preprocessor.is_some())?;
        // retain and compile only dirty sources and all their imports
        sources.filter(&mut cache);

        Ok(PreprocessedState { sources, cache, primary_profiles, preprocessor })
    }
}

/// A series of states that comprise the [`ProjectCompiler::compile()`] state machine
///
/// The main reason is to debug all states individually
#[derive(Debug)]
struct PreprocessedState<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
{
    /// Contains all the sources to compile.
    sources: CompilerSources<'a, C::Language, C::Settings>,

    /// Cache that holds `CacheEntry` objects if caching is enabled and the project is recompiled
    cache: ArtifactsCache<'a, T, C>,

    /// A mapping from a source file path to the primary profile name selected for it.
    primary_profiles: HashMap<PathBuf, &'a str>,

    /// Optional preprocessor
    preprocessor: Option<Box<dyn Preprocessor<C>>>,
}

impl<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
    PreprocessedState<'a, T, C>
{
    /// advance to the next state by compiling all sources
    #[instrument(skip_all)]
    fn compile(self) -> Result<CompiledState<'a, T, C>> {
        trace!("compiling");
        let PreprocessedState { sources, mut cache, primary_profiles, preprocessor } = self;

        let mut output = sources.compile(&mut cache, preprocessor)?;

        // Source paths get stripped before handing them over to solc, so solc never uses absolute
        // paths; instead `--base-path <root dir>` is set. This way any metadata that's derived from
        // paths is relative to the project dir and should be independent of the current OS and
        // disk layout. However, internally we still want to keep absolute paths, so we join the
        // contracts again.
        output.join_all(cache.project().root());

        Ok(CompiledState { output, cache, primary_profiles })
    }
}

/// Represents the state after `solc` was successfully invoked
#[derive(Debug)]
struct CompiledState<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler> {
    output: AggregatedCompilerOutput<C>,
    cache: ArtifactsCache<'a, T, C>,
    primary_profiles: HashMap<PathBuf, &'a str>,
}

impl<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
    CompiledState<'a, T, C>
{
    /// advance to the next state by handling all artifacts
    ///
    /// Writes all output contracts to disk if enabled in the `Project` and if the build was
    /// successful
    #[instrument(skip_all)]
    fn write_artifacts(self) -> Result<ArtifactsState<'a, T, C>> {
        let CompiledState { output, cache, primary_profiles } = self;

        let project = cache.project();
        let ctx = cache.output_ctx();
        // write all artifacts via the handler but only if the build succeeded and project wasn't
        // configured with `no_artifacts == true`
        let compiled_artifacts = if project.no_artifacts {
            project.artifacts_handler().output_to_artifacts(
                &output.contracts,
                &output.sources,
                ctx,
                &project.paths,
                &primary_profiles,
            )
        } else if output.has_error(
            &project.ignored_error_codes,
            &project.ignored_file_paths,
            &project.compiler_severity_filter,
        ) {
            trace!("skip writing cache file due to solc errors: {:?}", output.errors);
            project.artifacts_handler().output_to_artifacts(
                &output.contracts,
                &output.sources,
                ctx,
                &project.paths,
                &primary_profiles,
            )
        } else {
            trace!(
                "handling artifact output for {} contracts and {} sources",
                output.contracts.len(),
                output.sources.len()
            );
            // this emits the artifacts via the project's artifacts handler
            let artifacts = project.artifacts_handler().on_output(
                &output.contracts,
                &output.sources,
                &project.paths,
                ctx,
                &primary_profiles,
            )?;

            // emits all the build infos, if they exist
            output.write_build_infos(project.build_info_path())?;

            artifacts
        };

        Ok(ArtifactsState { output, cache, compiled_artifacts })
    }
}

/// Represents the state after all artifacts were written to disk
#[derive(Debug)]
struct ArtifactsState<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler> {
    output: AggregatedCompilerOutput<C>,
    cache: ArtifactsCache<'a, T, C>,
    compiled_artifacts: Artifacts<T::Artifact>,
}

impl<T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
    ArtifactsState<'_, T, C>
{
    /// Writes the cache file
    ///
    /// this concludes the [`Project::compile()`] statemachine
    #[instrument(skip_all)]
    fn write_cache(self) -> Result<ProjectCompileOutput<C, T>> {
        let ArtifactsState { output, cache, compiled_artifacts } = self;
        let project = cache.project();
        let ignored_error_codes = project.ignored_error_codes.clone();
        let ignored_file_paths = project.ignored_file_paths.clone();
        let compiler_severity_filter = project.compiler_severity_filter;
        let has_error =
            output.has_error(&ignored_error_codes, &ignored_file_paths, &compiler_severity_filter);
        let skip_write_to_disk = project.no_artifacts || has_error;
        trace!(has_error, project.no_artifacts, skip_write_to_disk, cache_path=?project.cache_path(),"prepare writing cache file");

        let (cached_artifacts, cached_builds, edges) =
            cache.consume(&compiled_artifacts, &output.build_infos, !skip_write_to_disk)?;

        project.artifacts_handler().handle_cached_artifacts(&cached_artifacts)?;

        let builds = Builds(
            output
                .build_infos
                .iter()
                .map(|build_info| (build_info.id.clone(), build_info.build_context.clone()))
                .chain(cached_builds)
                .map(|(id, context)| (id, context.with_joined_paths(project.paths.root.as_path())))
                .collect(),
        );

        Ok(ProjectCompileOutput {
            compiler_output: output,
            compiled_artifacts,
            cached_artifacts,
            ignored_error_codes,
            ignored_file_paths,
            compiler_severity_filter,
            builds,
            edges,
        })
    }
}

/// Determines how the `solc <-> sources` pairs are executed.
#[derive(Debug, Clone)]
struct CompilerSources<'a, L, S> {
    /// The sources to compile.
    sources: VersionedSources<'a, L, S>,
    /// The number of jobs to use for parallel compilation.
    jobs: Option<usize>,
}

impl<L: Language, S: CompilerSettings> CompilerSources<'_, L, S> {
    /// Converts all `\\` separators to `/`.
    ///
    /// This effectively ensures that `solc` can find imported files like `/src/Cheats.sol` in the
    /// VFS (the `CompilerInput` as json) under `src/Cheats.sol`.
    fn slash_paths(&mut self) {
        #[cfg(windows)]
        {
            use path_slash::PathBufExt;

            self.sources.values_mut().for_each(|versioned_sources| {
                versioned_sources.iter_mut().for_each(|(_, sources, _)| {
                    *sources = std::mem::take(sources)
                        .into_iter()
                        .map(|(path, source)| {
                            (PathBuf::from(path.to_slash_lossy().as_ref()), source)
                        })
                        .collect()
                })
            });
        }
    }

    /// Filters out all sources that don't need to be compiled, see [`ArtifactsCache::filter`]
    #[instrument(name = "CompilerSources::filter", skip_all)]
    fn filter<
        T: ArtifactOutput<CompilerContract = C::CompilerContract>,
        C: Compiler<Language = L>,
    >(
        &mut self,
        cache: &mut ArtifactsCache<'_, T, C>,
    ) {
        cache.remove_dirty_sources();
        for versioned_sources in self.sources.values_mut() {
            for (version, sources, (profile, _)) in versioned_sources {
                trace!("Filtering {} sources for {}", sources.len(), version);
                cache.filter(sources, version, profile);
                trace!(
                    "Detected {} sources to compile {:?}",
                    sources.dirty().count(),
                    sources.dirty_files().collect::<Vec<_>>()
                );
            }
        }
    }

    /// Compiles all the files with `Solc`
    fn compile<
        C: Compiler<Language = L, Settings = S>,
        T: ArtifactOutput<CompilerContract = C::CompilerContract>,
    >(
        self,
        cache: &mut ArtifactsCache<'_, T, C>,
        preprocessor: Option<Box<dyn Preprocessor<C>>>,
    ) -> Result<AggregatedCompilerOutput<C>> {
        let project = cache.project();
        let graph = cache.graph();

        let jobs_cnt = self.jobs;

        let sparse_output = SparseOutputFilter::new(project.sparse_output.as_deref());

        // Include additional paths collected during graph resolution.
        let mut include_paths = project.paths.include_paths.clone();
        include_paths.extend(graph.include_paths().clone());

        // Get current list of mocks from cache. This will be passed to preprocessors and updated
        // accordingly, then set back in cache.
        let mut mocks = cache.mocks();

        let mut jobs = Vec::new();
        for (language, versioned_sources) in self.sources {
            for (version, sources, (profile, opt_settings)) in versioned_sources {
                let mut opt_settings = opt_settings.clone();
                if sources.is_empty() {
                    // nothing to compile
                    trace!("skip {} for empty sources set", version);
                    continue;
                }

                // depending on the composition of the filtered sources, the output selection can be
                // optimized
                let actually_dirty =
                    sparse_output.sparse_sources(&sources, &mut opt_settings, graph);

                if actually_dirty.is_empty() {
                    // nothing to compile for this particular language, all dirty files are in the
                    // other language set
                    trace!("skip {} run due to empty source set", version);
                    continue;
                }

                trace!("calling {} with {} sources {:?}", version, sources.len(), sources.keys());

                let settings = opt_settings
                    .with_base_path(&project.paths.root)
                    .with_allow_paths(&project.paths.allowed_paths)
                    .with_include_paths(&include_paths)
                    .with_remappings(&project.paths.remappings);

                let mut input = C::Input::build(sources, settings, language, version.clone());

                input.strip_prefix(project.paths.root.as_path());

                if let Some(preprocessor) = preprocessor.as_ref() {
                    preprocessor.preprocess(
                        &project.compiler,
                        &mut input,
                        &project.paths,
                        &mut mocks,
                    )?;
                }

                jobs.push((input, profile, actually_dirty));
            }
        }

        // Update cache with mocks updated by preprocessors.
        cache.update_mocks(mocks);

        let results = if let Some(num_jobs) = jobs_cnt {
            compile_parallel(&project.compiler, jobs, num_jobs)
        } else {
            compile_sequential(&project.compiler, jobs)
        }?;

        let mut aggregated = AggregatedCompilerOutput::default();

        for (input, mut output, profile, actually_dirty) in results {
            let version = input.version();

            // Mark all files as seen by the compiler
            for file in &actually_dirty {
                cache.compiler_seen(file);
            }

            let build_info = RawBuildInfo::new(&input, &output, project.build_info)?;

            output.retain_files(
                actually_dirty
                    .iter()
                    .map(|f| f.strip_prefix(project.paths.root.as_path()).unwrap_or(f)),
            );
            output.join_all(project.paths.root.as_path());

            aggregated.extend(version.clone(), build_info, profile, output);
        }

        Ok(aggregated)
    }
}

type CompilationResult<'a, I, E, C> = Result<Vec<(I, CompilerOutput<E, C>, &'a str, Vec<PathBuf>)>>;

/// Compiles the input set sequentially and returns a [Vec] of outputs.
fn compile_sequential<'a, C: Compiler>(
    compiler: &C,
    jobs: Vec<(C::Input, &'a str, Vec<PathBuf>)>,
) -> CompilationResult<'a, C::Input, C::CompilationError, C::CompilerContract> {
    jobs.into_iter()
        .map(|(input, profile, actually_dirty)| {
            let start = Instant::now();
            report::compiler_spawn(
                &input.compiler_name(),
                input.version(),
                actually_dirty.as_slice(),
            );
            let output = compiler.compile(&input)?;
            report::compiler_success(&input.compiler_name(), input.version(), &start.elapsed());

            Ok((input, output, profile, actually_dirty))
        })
        .collect()
}

/// compiles the input set using `num_jobs` threads
fn compile_parallel<'a, C: Compiler>(
    compiler: &C,
    jobs: Vec<(C::Input, &'a str, Vec<PathBuf>)>,
    num_jobs: usize,
) -> CompilationResult<'a, C::Input, C::CompilationError, C::CompilerContract> {
    // need to get the currently installed reporter before installing the pool, otherwise each new
    // thread in the pool will get initialized with the default value of the `thread_local!`'s
    // localkey. This way we keep access to the reporter in the rayon pool
    let scoped_report = report::get_default(|reporter| reporter.clone());

    // start a rayon threadpool that will execute all `Solc::compile()` processes
    let pool = rayon::ThreadPoolBuilder::new().num_threads(num_jobs).build().unwrap();

    pool.install(move || {
        jobs.into_par_iter()
            .map(move |(input, profile, actually_dirty)| {
                // set the reporter on this thread
                let _guard = report::set_scoped(&scoped_report);

                let start = Instant::now();
                report::compiler_spawn(
                    &input.compiler_name(),
                    input.version(),
                    actually_dirty.as_slice(),
                );
                compiler.compile(&input).map(move |output| {
                    report::compiler_success(
                        &input.compiler_name(),
                        input.version(),
                        &start.elapsed(),
                    );
                    (input, output, profile, actually_dirty)
                })
            })
            .collect()
    })
}

#[cfg(test)]
#[cfg(all(feature = "project-util", feature = "svm-solc"))]
mod tests {
    use std::path::Path;

    use foundry_compilers_artifacts::output_selection::ContractOutputSelection;

    use crate::{
        compilers::multi::MultiCompiler, project_util::TempProject, ConfigurableArtifacts,
        MinimalCombinedArtifacts, ProjectPathsConfig,
    };

    use super::*;

    fn init_tracing() {
        let _ = tracing_subscriber::fmt()
            .with_env_filter(tracing_subscriber::EnvFilter::from_default_env())
            .try_init()
            .ok();
    }

    #[test]
    fn can_preprocess() {
        let root = Path::new(env!("CARGO_MANIFEST_DIR")).join("../../test-data/dapp-sample");
        let project = Project::builder()
            .paths(ProjectPathsConfig::dapptools(&root).unwrap())
            .build(Default::default())
            .unwrap();

        let compiler = ProjectCompiler::new(&project).unwrap();
        let prep = compiler.preprocess().unwrap();
        let cache = prep.cache.as_cached().unwrap();
        // ensure that we have exactly 3 empty entries which will be filled on compilation.
        assert_eq!(cache.cache.files.len(), 3);
        assert!(cache.cache.files.values().all(|v| v.artifacts.is_empty()));

        let compiled = prep.compile().unwrap();
        assert_eq!(compiled.output.contracts.files().count(), 3);
    }

    #[test]
    fn can_detect_cached_files() {
        let root = Path::new(env!("CARGO_MANIFEST_DIR")).join("../../test-data/dapp-sample");
        let paths = ProjectPathsConfig::builder().sources(root.join("src")).lib(root.join("lib"));
        let project = TempProject::<MultiCompiler, MinimalCombinedArtifacts>::new(paths).unwrap();

        let compiled = project.compile().unwrap();
        compiled.assert_success();

        let inner = project.project();
        let compiler = ProjectCompiler::new(inner).unwrap();
        let prep = compiler.preprocess().unwrap();
        assert!(prep.cache.as_cached().unwrap().dirty_sources.is_empty())
    }

    #[test]
    fn can_recompile_with_optimized_output() {
        let tmp = TempProject::<MultiCompiler, ConfigurableArtifacts>::dapptools().unwrap();

        tmp.add_source(
            "A",
            r#"
    pragma solidity ^0.8.10;
    import "./B.sol";
    contract A {}
   "#,
        )
        .unwrap();

        tmp.add_source(
            "B",
            r#"
    pragma solidity ^0.8.10;
    contract B {
        function hello() public {}
    }
    import "./C.sol";
   "#,
        )
        .unwrap();

        tmp.add_source(
            "C",
            r"
    pragma solidity ^0.8.10;
    contract C {
            function hello() public {}
    }
   ",
        )
        .unwrap();
        let compiled = tmp.compile().unwrap();
        compiled.assert_success();

        tmp.artifacts_snapshot().unwrap().assert_artifacts_essentials_present();

        // modify A.sol
        tmp.add_source(
            "A",
            r#"
    pragma solidity ^0.8.10;
    import "./B.sol";
    contract A {
        function testExample() public {}
    }
   "#,
        )
        .unwrap();

        let compiler = ProjectCompiler::new(tmp.project()).unwrap();
        let state = compiler.preprocess().unwrap();
        let sources = &state.sources.sources;

        let cache = state.cache.as_cached().unwrap();

        // 2 clean sources
        assert_eq!(cache.cache.artifacts_len(), 2);
        assert!(cache.cache.all_artifacts_exist());
        assert_eq!(cache.dirty_sources.len(), 1);

        let len = sources.values().map(|v| v.len()).sum::<usize>();
        // single solc
        assert_eq!(len, 1);

        let filtered = &sources.values().next().unwrap()[0].1;

        // 3 contracts total
        assert_eq!(filtered.0.len(), 3);
        // A is modified
        assert_eq!(filtered.dirty().count(), 1);
        assert!(filtered.dirty_files().next().unwrap().ends_with("A.sol"));

        let state = state.compile().unwrap();
        assert_eq!(state.output.sources.len(), 1);
        for (f, source) in state.output.sources.sources() {
            if f.ends_with("A.sol") {
                assert!(source.ast.is_some());
            } else {
                assert!(source.ast.is_none());
            }
        }

        assert_eq!(state.output.contracts.len(), 1);
        let (a, c) = state.output.contracts_iter().next().unwrap();
        assert_eq!(a, "A");
        assert!(c.abi.is_some() && c.evm.is_some());

        let state = state.write_artifacts().unwrap();
        assert_eq!(state.compiled_artifacts.as_ref().len(), 1);

        let out = state.write_cache().unwrap();

        let artifacts: Vec<_> = out.into_artifacts().collect();
        assert_eq!(artifacts.len(), 3);
        for (_, artifact) in artifacts {
            let c = artifact.into_contract_bytecode();
            assert!(c.abi.is_some() && c.bytecode.is_some() && c.deployed_bytecode.is_some());
        }

        tmp.artifacts_snapshot().unwrap().assert_artifacts_essentials_present();
    }

    #[test]
    #[ignore]
    fn can_compile_real_project() {
        init_tracing();
        let paths = ProjectPathsConfig::builder()
            .root("../../foundry-integration-tests/testdata/solmate")
            .build()
            .unwrap();
        let project = Project::builder().paths(paths).build(Default::default()).unwrap();
        let compiler = ProjectCompiler::new(&project).unwrap();
        let _out = compiler.compile().unwrap();
    }

    #[test]
    fn extra_output_cached() {
        let root = Path::new(env!("CARGO_MANIFEST_DIR")).join("../../test-data/dapp-sample");
        let paths = ProjectPathsConfig::builder().sources(root.join("src")).lib(root.join("lib"));
        let mut project = TempProject::<MultiCompiler>::new(paths).unwrap();

        // Compile once without enabled extra output
        project.compile().unwrap();

        // Enable extra output of abi
        project.project_mut().artifacts =
            ConfigurableArtifacts::new([], [ContractOutputSelection::Abi]);

        // Ensure that abi appears after compilation and that we didn't recompile anything
        let abi_path = project.project().paths.artifacts.join("Dapp.sol/Dapp.abi.json");
        assert!(!abi_path.exists());
        let output = project.compile().unwrap();
        assert!(output.compiler_output.is_empty());
        assert!(abi_path.exists());
    }

    #[test]
    fn can_compile_leftovers_after_sparse() {
        let mut tmp = TempProject::<MultiCompiler, ConfigurableArtifacts>::dapptools().unwrap();

        tmp.add_source(
            "A",
            r#"
pragma solidity ^0.8.10;
import "./B.sol";
contract A {}
"#,
        )
        .unwrap();

        tmp.add_source(
            "B",
            r#"
pragma solidity ^0.8.10;
contract B {}
"#,
        )
        .unwrap();

        tmp.project_mut().sparse_output = Some(Box::new(|f: &Path| f.ends_with("A.sol")));
        let compiled = tmp.compile().unwrap();
        compiled.assert_success();
        assert_eq!(compiled.artifacts().count(), 1);

        tmp.project_mut().sparse_output = None;
        let compiled = tmp.compile().unwrap();
        compiled.assert_success();
        assert_eq!(compiled.artifacts().count(), 2);
    }
}