foundry_compilers/compile/
project.rs

1//! Manages compiling of a `Project`
2//!
3//! The compilation of a project is performed in several steps.
4//!
5//! First the project's dependency graph [`crate::Graph`] is constructed and all imported
6//! dependencies are resolved. The graph holds all the relationships between the files and their
7//! versions. From there the appropriate version set is derived
8//! [`crate::Graph`] which need to be compiled with different
9//! [`crate::compilers::solc::Solc`] versions.
10//!
11//! At this point we check if we need to compile a source file or whether we can reuse an _existing_
12//! `Artifact`. We don't to compile if:
13//!     - caching is enabled
14//!     - the file is **not** dirty
15//!     - the artifact for that file exists
16//!
17//! This concludes the preprocessing, and we now have either
18//!    - only `Source` files that need to be compiled
19//!    - only cached `Artifacts`, compilation can be skipped. This is considered an unchanged,
20//!      cached project
21//!    - Mix of both `Source` and `Artifacts`, only the `Source` files need to be compiled, the
22//!      `Artifacts` can be reused.
23//!
24//! The final step is invoking `Solc` via the standard JSON format.
25//!
26//! ### Notes on [Import Path Resolution](https://docs.soliditylang.org/en/develop/path-resolution.html#path-resolution)
27//!
28//! In order to be able to support reproducible builds on all platforms, the Solidity compiler has
29//! to abstract away the details of the filesystem where source files are stored. Paths used in
30//! imports must work the same way everywhere while the command-line interface must be able to work
31//! with platform-specific paths to provide good user experience. This section aims to explain in
32//! detail how Solidity reconciles these requirements.
33//!
34//! The compiler maintains an internal database (virtual filesystem or VFS for short) where each
35//! source unit is assigned a unique source unit name which is an opaque and unstructured
36//! identifier. When you use the import statement, you specify an import path that references a
37//! source unit name. If the compiler does not find any source unit name matching the import path in
38//! the VFS, it invokes the callback, which is responsible for obtaining the source code to be
39//! placed under that name.
40//!
41//! This becomes relevant when dealing with resolved imports
42//!
43//! #### Relative Imports
44//!
45//! ```solidity
46//! import "./math/math.sol";
47//! import "contracts/tokens/token.sol";
48//! ```
49//! In the above `./math/math.sol` and `contracts/tokens/token.sol` are import paths while the
50//! source unit names they translate to are `contracts/math/math.sol` and
51//! `contracts/tokens/token.sol` respectively.
52//!
53//! #### Direct Imports
54//!
55//! An import that does not start with `./` or `../` is a direct import.
56//!
57//! ```solidity
58//! import "/project/lib/util.sol";         // source unit name: /project/lib/util.sol
59//! import "lib/util.sol";                  // source unit name: lib/util.sol
60//! import "@openzeppelin/address.sol";     // source unit name: @openzeppelin/address.sol
61//! import "https://example.com/token.sol"; // source unit name: <https://example.com/token.sol>
62//! ```
63//!
64//! After applying any import remappings the import path simply becomes the source unit name.
65//!
66//! ##### Import Remapping
67//!
68//! ```solidity
69//! import "github.com/ethereum/dapp-bin/library/math.sol"; // source unit name: dapp-bin/library/math.sol
70//! ```
71//!
72//! If compiled with `solc github.com/ethereum/dapp-bin/=dapp-bin/` the compiler will look for the
73//! file in the VFS under `dapp-bin/library/math.sol`. If the file is not available there, the
74//! source unit name will be passed to the Host Filesystem Loader, which will then look in
75//! `/project/dapp-bin/library/iterable_mapping.sol`
76//!
77//!
78//! ### Caching and Change detection
79//!
80//! If caching is enabled in the [Project] a cache file will be created upon a successful solc
81//! build. The [cache file](crate::cache::CompilerCache) stores metadata for all the files that were
82//! provided to solc.
83//! For every file the cache file contains a dedicated [cache entry](crate::cache::CacheEntry),
84//! which represents the state of the file. A solidity file can contain several contracts, for every
85//! contract a separate [artifact](crate::Artifact) is emitted. Therefore the entry also tracks all
86//! artifacts emitted by a file. A solidity file can also be compiled with several solc versions.
87//!
88//! For example in `A(<=0.8.10) imports C(>0.4.0)` and
89//! `B(0.8.11) imports C(>0.4.0)`, both `A` and `B` import `C` but there's no solc version that's
90//! compatible with `A` and `B`, in which case two sets are compiled: [`A`, `C`] and [`B`, `C`].
91//! This is reflected in the cache entry which tracks the file's artifacts by version.
92//!
93//! The cache makes it possible to detect changes during recompilation, so that only the changed,
94//! dirty, files need to be passed to solc. A file will be considered as dirty if:
95//!   - the file is new, not included in the existing cache
96//!   - the file was modified since the last compiler run, detected by comparing content hashes
97//!   - any of the imported files is dirty
98//!   - the file's artifacts don't exist, were deleted.
99//!
100//! Recompiling a project with cache enabled detects all files that meet these criteria and provides
101//! solc with only these dirty files instead of the entire source set.
102
103use crate::{
104    artifact_output::Artifacts,
105    buildinfo::RawBuildInfo,
106    cache::ArtifactsCache,
107    compilers::{Compiler, CompilerInput, CompilerOutput, Language},
108    filter::SparseOutputFilter,
109    output::{AggregatedCompilerOutput, Builds},
110    report,
111    resolver::{GraphEdges, ResolvedSources},
112    ArtifactOutput, CompilerSettings, Graph, Project, ProjectCompileOutput, Sources,
113};
114use foundry_compilers_core::error::Result;
115use rayon::prelude::*;
116use semver::Version;
117use std::{collections::HashMap, path::PathBuf, time::Instant};
118
119/// A set of different Solc installations with their version and the sources to be compiled
120pub(crate) type VersionedSources<'a, L, S> = HashMap<L, Vec<(Version, Sources, (&'a str, &'a S))>>;
121
122#[derive(Debug)]
123pub struct ProjectCompiler<
124    'a,
125    T: ArtifactOutput<CompilerContract = C::CompilerContract>,
126    C: Compiler,
127> {
128    /// Contains the relationship of the source files and their imports
129    edges: GraphEdges<C::ParsedSource>,
130    project: &'a Project<C, T>,
131    /// A mapping from a source file path to the primary profile name selected for it.
132    primary_profiles: HashMap<PathBuf, &'a str>,
133    /// how to compile all the sources
134    sources: CompilerSources<'a, C::Language, C::Settings>,
135}
136
137impl<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
138    ProjectCompiler<'a, T, C>
139{
140    /// Create a new `ProjectCompiler` to bootstrap the compilation process of the project's
141    /// sources.
142    pub fn new(project: &'a Project<C, T>) -> Result<Self> {
143        Self::with_sources(project, project.paths.read_input_files()?)
144    }
145
146    /// Bootstraps the compilation process by resolving the dependency graph of all sources and the
147    /// appropriate `Solc` -> `Sources` set as well as the compile mode to use (parallel,
148    /// sequential)
149    ///
150    /// Multiple (`Solc` -> `Sources`) pairs can be compiled in parallel if the `Project` allows
151    /// multiple `jobs`, see [`crate::Project::set_solc_jobs()`].
152    pub fn with_sources(project: &'a Project<C, T>, mut sources: Sources) -> Result<Self> {
153        if let Some(filter) = &project.sparse_output {
154            sources.retain(|f, _| filter.is_match(f))
155        }
156        let graph = Graph::resolve_sources(&project.paths, sources)?;
157        let ResolvedSources { sources, primary_profiles, edges } =
158            graph.into_sources_by_version(project)?;
159
160        // If there are multiple different versions, and we can use multiple jobs we can compile
161        // them in parallel.
162        let jobs_cnt = || sources.values().map(|v| v.len()).sum::<usize>();
163        let sources = CompilerSources {
164            jobs: (project.solc_jobs > 1 && jobs_cnt() > 1).then_some(project.solc_jobs),
165            sources,
166        };
167
168        Ok(Self { edges, primary_profiles, project, sources })
169    }
170
171    /// Compiles all the sources of the `Project` in the appropriate mode
172    ///
173    /// If caching is enabled, the sources are filtered and only _dirty_ sources are recompiled.
174    ///
175    /// The output of the compile process can be a mix of reused artifacts and freshly compiled
176    /// `Contract`s
177    ///
178    /// # Examples
179    /// ```no_run
180    /// use foundry_compilers::Project;
181    ///
182    /// let project = Project::builder().build(Default::default())?;
183    /// let output = project.compile()?;
184    /// # Ok::<(), Box<dyn std::error::Error>>(())
185    /// ```
186    pub fn compile(self) -> Result<ProjectCompileOutput<C, T>> {
187        let slash_paths = self.project.slash_paths;
188
189        // drive the compiler statemachine to completion
190        let mut output = self.preprocess()?.compile()?.write_artifacts()?.write_cache()?;
191
192        if slash_paths {
193            // ensures we always use `/` paths
194            output.slash_paths();
195        }
196
197        Ok(output)
198    }
199
200    /// Does basic preprocessing
201    ///   - sets proper source unit names
202    ///   - check cache
203    fn preprocess(self) -> Result<PreprocessedState<'a, T, C>> {
204        trace!("preprocessing");
205        let Self { edges, project, mut sources, primary_profiles } = self;
206
207        // convert paths on windows to ensure consistency with the `CompilerOutput` `solc` emits,
208        // which is unix style `/`
209        sources.slash_paths();
210
211        let mut cache = ArtifactsCache::new(project, edges)?;
212        // retain and compile only dirty sources and all their imports
213        sources.filter(&mut cache);
214
215        Ok(PreprocessedState { sources, cache, primary_profiles })
216    }
217}
218
219/// A series of states that comprise the [`ProjectCompiler::compile()`] state machine
220///
221/// The main reason is to debug all states individually
222#[derive(Debug)]
223struct PreprocessedState<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
224{
225    /// Contains all the sources to compile.
226    sources: CompilerSources<'a, C::Language, C::Settings>,
227
228    /// Cache that holds `CacheEntry` objects if caching is enabled and the project is recompiled
229    cache: ArtifactsCache<'a, T, C>,
230
231    /// A mapping from a source file path to the primary profile name selected for it.
232    primary_profiles: HashMap<PathBuf, &'a str>,
233}
234
235impl<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
236    PreprocessedState<'a, T, C>
237{
238    /// advance to the next state by compiling all sources
239    fn compile(self) -> Result<CompiledState<'a, T, C>> {
240        trace!("compiling");
241        let PreprocessedState { sources, mut cache, primary_profiles } = self;
242
243        let mut output = sources.compile(&mut cache)?;
244
245        // source paths get stripped before handing them over to solc, so solc never uses absolute
246        // paths, instead `--base-path <root dir>` is set. this way any metadata that's derived from
247        // data (paths) is relative to the project dir and should be independent of the current OS
248        // disk. However internally we still want to keep absolute paths, so we join the
249        // contracts again
250        output.join_all(cache.project().root());
251
252        Ok(CompiledState { output, cache, primary_profiles })
253    }
254}
255
256/// Represents the state after `solc` was successfully invoked
257#[derive(Debug)]
258struct CompiledState<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler> {
259    output: AggregatedCompilerOutput<C>,
260    cache: ArtifactsCache<'a, T, C>,
261    primary_profiles: HashMap<PathBuf, &'a str>,
262}
263
264impl<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
265    CompiledState<'a, T, C>
266{
267    /// advance to the next state by handling all artifacts
268    ///
269    /// Writes all output contracts to disk if enabled in the `Project` and if the build was
270    /// successful
271    #[instrument(skip_all, name = "write-artifacts")]
272    fn write_artifacts(self) -> Result<ArtifactsState<'a, T, C>> {
273        let CompiledState { output, cache, primary_profiles } = self;
274
275        let project = cache.project();
276        let ctx = cache.output_ctx();
277        // write all artifacts via the handler but only if the build succeeded and project wasn't
278        // configured with `no_artifacts == true`
279        let compiled_artifacts = if project.no_artifacts {
280            project.artifacts_handler().output_to_artifacts(
281                &output.contracts,
282                &output.sources,
283                ctx,
284                &project.paths,
285                &primary_profiles,
286            )
287        } else if output.has_error(
288            &project.ignored_error_codes,
289            &project.ignored_file_paths,
290            &project.compiler_severity_filter,
291        ) {
292            trace!("skip writing cache file due to solc errors: {:?}", output.errors);
293            project.artifacts_handler().output_to_artifacts(
294                &output.contracts,
295                &output.sources,
296                ctx,
297                &project.paths,
298                &primary_profiles,
299            )
300        } else {
301            trace!(
302                "handling artifact output for {} contracts and {} sources",
303                output.contracts.len(),
304                output.sources.len()
305            );
306            // this emits the artifacts via the project's artifacts handler
307            let artifacts = project.artifacts_handler().on_output(
308                &output.contracts,
309                &output.sources,
310                &project.paths,
311                ctx,
312                &primary_profiles,
313            )?;
314
315            // emits all the build infos, if they exist
316            output.write_build_infos(project.build_info_path())?;
317
318            artifacts
319        };
320
321        Ok(ArtifactsState { output, cache, compiled_artifacts })
322    }
323}
324
325/// Represents the state after all artifacts were written to disk
326#[derive(Debug)]
327struct ArtifactsState<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler> {
328    output: AggregatedCompilerOutput<C>,
329    cache: ArtifactsCache<'a, T, C>,
330    compiled_artifacts: Artifacts<T::Artifact>,
331}
332
333impl<T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
334    ArtifactsState<'_, T, C>
335{
336    /// Writes the cache file
337    ///
338    /// this concludes the [`Project::compile()`] statemachine
339    fn write_cache(self) -> Result<ProjectCompileOutput<C, T>> {
340        let ArtifactsState { output, cache, compiled_artifacts } = self;
341        let project = cache.project();
342        let ignored_error_codes = project.ignored_error_codes.clone();
343        let ignored_file_paths = project.ignored_file_paths.clone();
344        let compiler_severity_filter = project.compiler_severity_filter;
345        let has_error =
346            output.has_error(&ignored_error_codes, &ignored_file_paths, &compiler_severity_filter);
347        let skip_write_to_disk = project.no_artifacts || has_error;
348        trace!(has_error, project.no_artifacts, skip_write_to_disk, cache_path=?project.cache_path(),"prepare writing cache file");
349
350        let (cached_artifacts, cached_builds) =
351            cache.consume(&compiled_artifacts, &output.build_infos, !skip_write_to_disk)?;
352
353        project.artifacts_handler().handle_cached_artifacts(&cached_artifacts)?;
354
355        let builds = Builds(
356            output
357                .build_infos
358                .iter()
359                .map(|build_info| (build_info.id.clone(), build_info.build_context.clone()))
360                .chain(cached_builds)
361                .map(|(id, context)| (id, context.with_joined_paths(project.paths.root.as_path())))
362                .collect(),
363        );
364
365        Ok(ProjectCompileOutput {
366            compiler_output: output,
367            compiled_artifacts,
368            cached_artifacts,
369            ignored_error_codes,
370            ignored_file_paths,
371            compiler_severity_filter,
372            builds,
373        })
374    }
375}
376
377/// Determines how the `solc <-> sources` pairs are executed.
378#[derive(Debug, Clone)]
379struct CompilerSources<'a, L, S> {
380    /// The sources to compile.
381    sources: VersionedSources<'a, L, S>,
382    /// The number of jobs to use for parallel compilation.
383    jobs: Option<usize>,
384}
385
386impl<L: Language, S: CompilerSettings> CompilerSources<'_, L, S> {
387    /// Converts all `\\` separators to `/`.
388    ///
389    /// This effectively ensures that `solc` can find imported files like `/src/Cheats.sol` in the
390    /// VFS (the `CompilerInput` as json) under `src/Cheats.sol`.
391    fn slash_paths(&mut self) {
392        #[cfg(windows)]
393        {
394            use path_slash::PathBufExt;
395
396            self.sources.values_mut().for_each(|versioned_sources| {
397                versioned_sources.iter_mut().for_each(|(_, sources, _)| {
398                    *sources = std::mem::take(sources)
399                        .into_iter()
400                        .map(|(path, source)| {
401                            (PathBuf::from(path.to_slash_lossy().as_ref()), source)
402                        })
403                        .collect()
404                })
405            });
406        }
407    }
408
409    /// Filters out all sources that don't need to be compiled, see [`ArtifactsCache::filter`]
410    fn filter<
411        T: ArtifactOutput<CompilerContract = C::CompilerContract>,
412        C: Compiler<Language = L>,
413    >(
414        &mut self,
415        cache: &mut ArtifactsCache<'_, T, C>,
416    ) {
417        cache.remove_dirty_sources();
418        for versioned_sources in self.sources.values_mut() {
419            for (version, sources, (profile, _)) in versioned_sources {
420                trace!("Filtering {} sources for {}", sources.len(), version);
421                cache.filter(sources, version, profile);
422                trace!(
423                    "Detected {} sources to compile {:?}",
424                    sources.dirty().count(),
425                    sources.dirty_files().collect::<Vec<_>>()
426                );
427            }
428        }
429    }
430
431    /// Compiles all the files with `Solc`
432    fn compile<
433        C: Compiler<Language = L, Settings = S>,
434        T: ArtifactOutput<CompilerContract = C::CompilerContract>,
435    >(
436        self,
437        cache: &mut ArtifactsCache<'_, T, C>,
438    ) -> Result<AggregatedCompilerOutput<C>> {
439        let project = cache.project();
440        let graph = cache.graph();
441
442        let jobs_cnt = self.jobs;
443
444        let sparse_output = SparseOutputFilter::new(project.sparse_output.as_deref());
445
446        // Include additional paths collected during graph resolution.
447        let mut include_paths = project.paths.include_paths.clone();
448        include_paths.extend(graph.include_paths().clone());
449
450        let mut jobs = Vec::new();
451        for (language, versioned_sources) in self.sources {
452            for (version, sources, (profile, opt_settings)) in versioned_sources {
453                let mut opt_settings = opt_settings.clone();
454                if sources.is_empty() {
455                    // nothing to compile
456                    trace!("skip {} for empty sources set", version);
457                    continue;
458                }
459
460                // depending on the composition of the filtered sources, the output selection can be
461                // optimized
462                let actually_dirty =
463                    sparse_output.sparse_sources(&sources, &mut opt_settings, graph);
464
465                if actually_dirty.is_empty() {
466                    // nothing to compile for this particular language, all dirty files are in the
467                    // other language set
468                    trace!("skip {} run due to empty source set", version);
469                    continue;
470                }
471
472                trace!("calling {} with {} sources {:?}", version, sources.len(), sources.keys());
473
474                let settings = opt_settings
475                    .with_base_path(&project.paths.root)
476                    .with_allow_paths(&project.paths.allowed_paths)
477                    .with_include_paths(&include_paths)
478                    .with_remappings(&project.paths.remappings);
479
480                let mut input = C::Input::build(sources, settings, language, version.clone());
481
482                input.strip_prefix(project.paths.root.as_path());
483
484                jobs.push((input, profile, actually_dirty));
485            }
486        }
487
488        let results = if let Some(num_jobs) = jobs_cnt {
489            compile_parallel(&project.compiler, jobs, num_jobs)
490        } else {
491            compile_sequential(&project.compiler, jobs)
492        }?;
493
494        let mut aggregated = AggregatedCompilerOutput::default();
495
496        for (input, mut output, profile, actually_dirty) in results {
497            let version = input.version();
498
499            // Mark all files as seen by the compiler
500            for file in &actually_dirty {
501                cache.compiler_seen(file);
502            }
503
504            let build_info = RawBuildInfo::new(&input, &output, project.build_info)?;
505
506            output.retain_files(
507                actually_dirty
508                    .iter()
509                    .map(|f| f.strip_prefix(project.paths.root.as_path()).unwrap_or(f)),
510            );
511            output.join_all(project.paths.root.as_path());
512
513            aggregated.extend(version.clone(), build_info, profile, output);
514        }
515
516        Ok(aggregated)
517    }
518}
519
520type CompilationResult<'a, I, E, C> = Result<Vec<(I, CompilerOutput<E, C>, &'a str, Vec<PathBuf>)>>;
521
522/// Compiles the input set sequentially and returns a [Vec] of outputs.
523fn compile_sequential<'a, C: Compiler>(
524    compiler: &C,
525    jobs: Vec<(C::Input, &'a str, Vec<PathBuf>)>,
526) -> CompilationResult<'a, C::Input, C::CompilationError, C::CompilerContract> {
527    jobs.into_iter()
528        .map(|(input, profile, actually_dirty)| {
529            let start = Instant::now();
530            report::compiler_spawn(
531                &input.compiler_name(),
532                input.version(),
533                actually_dirty.as_slice(),
534            );
535            let output = compiler.compile(&input)?;
536            report::compiler_success(&input.compiler_name(), input.version(), &start.elapsed());
537
538            Ok((input, output, profile, actually_dirty))
539        })
540        .collect()
541}
542
543/// compiles the input set using `num_jobs` threads
544fn compile_parallel<'a, C: Compiler>(
545    compiler: &C,
546    jobs: Vec<(C::Input, &'a str, Vec<PathBuf>)>,
547    num_jobs: usize,
548) -> CompilationResult<'a, C::Input, C::CompilationError, C::CompilerContract> {
549    // need to get the currently installed reporter before installing the pool, otherwise each new
550    // thread in the pool will get initialized with the default value of the `thread_local!`'s
551    // localkey. This way we keep access to the reporter in the rayon pool
552    let scoped_report = report::get_default(|reporter| reporter.clone());
553
554    // start a rayon threadpool that will execute all `Solc::compile()` processes
555    let pool = rayon::ThreadPoolBuilder::new().num_threads(num_jobs).build().unwrap();
556
557    pool.install(move || {
558        jobs.into_par_iter()
559            .map(move |(input, profile, actually_dirty)| {
560                // set the reporter on this thread
561                let _guard = report::set_scoped(&scoped_report);
562
563                let start = Instant::now();
564                report::compiler_spawn(
565                    &input.compiler_name(),
566                    input.version(),
567                    actually_dirty.as_slice(),
568                );
569                compiler.compile(&input).map(move |output| {
570                    report::compiler_success(
571                        &input.compiler_name(),
572                        input.version(),
573                        &start.elapsed(),
574                    );
575                    (input, output, profile, actually_dirty)
576                })
577            })
578            .collect()
579    })
580}
581
582#[cfg(test)]
583#[cfg(all(feature = "project-util", feature = "svm-solc"))]
584mod tests {
585    use std::path::Path;
586
587    use foundry_compilers_artifacts::output_selection::ContractOutputSelection;
588
589    use crate::{
590        compilers::multi::MultiCompiler, project_util::TempProject, ConfigurableArtifacts,
591        MinimalCombinedArtifacts, ProjectPathsConfig,
592    };
593
594    use super::*;
595
596    fn init_tracing() {
597        let _ = tracing_subscriber::fmt()
598            .with_env_filter(tracing_subscriber::EnvFilter::from_default_env())
599            .try_init()
600            .ok();
601    }
602
603    #[test]
604    fn can_preprocess() {
605        let root = Path::new(env!("CARGO_MANIFEST_DIR")).join("../../test-data/dapp-sample");
606        let project = Project::builder()
607            .paths(ProjectPathsConfig::dapptools(&root).unwrap())
608            .build(Default::default())
609            .unwrap();
610
611        let compiler = ProjectCompiler::new(&project).unwrap();
612        let prep = compiler.preprocess().unwrap();
613        let cache = prep.cache.as_cached().unwrap();
614        // ensure that we have exactly 3 empty entries which will be filled on compilation.
615        assert_eq!(cache.cache.files.len(), 3);
616        assert!(cache.cache.files.values().all(|v| v.artifacts.is_empty()));
617
618        let compiled = prep.compile().unwrap();
619        assert_eq!(compiled.output.contracts.files().count(), 3);
620    }
621
622    #[test]
623    fn can_detect_cached_files() {
624        let root = Path::new(env!("CARGO_MANIFEST_DIR")).join("../../test-data/dapp-sample");
625        let paths = ProjectPathsConfig::builder().sources(root.join("src")).lib(root.join("lib"));
626        let project = TempProject::<MultiCompiler, MinimalCombinedArtifacts>::new(paths).unwrap();
627
628        let compiled = project.compile().unwrap();
629        compiled.assert_success();
630
631        let inner = project.project();
632        let compiler = ProjectCompiler::new(inner).unwrap();
633        let prep = compiler.preprocess().unwrap();
634        assert!(prep.cache.as_cached().unwrap().dirty_sources.is_empty())
635    }
636
637    #[test]
638    fn can_recompile_with_optimized_output() {
639        let tmp = TempProject::<MultiCompiler, ConfigurableArtifacts>::dapptools().unwrap();
640
641        tmp.add_source(
642            "A",
643            r#"
644    pragma solidity ^0.8.10;
645    import "./B.sol";
646    contract A {}
647   "#,
648        )
649        .unwrap();
650
651        tmp.add_source(
652            "B",
653            r#"
654    pragma solidity ^0.8.10;
655    contract B {
656        function hello() public {}
657    }
658    import "./C.sol";
659   "#,
660        )
661        .unwrap();
662
663        tmp.add_source(
664            "C",
665            r"
666    pragma solidity ^0.8.10;
667    contract C {
668            function hello() public {}
669    }
670   ",
671        )
672        .unwrap();
673        let compiled = tmp.compile().unwrap();
674        compiled.assert_success();
675
676        tmp.artifacts_snapshot().unwrap().assert_artifacts_essentials_present();
677
678        // modify A.sol
679        tmp.add_source(
680            "A",
681            r#"
682    pragma solidity ^0.8.10;
683    import "./B.sol";
684    contract A {
685        function testExample() public {}
686    }
687   "#,
688        )
689        .unwrap();
690
691        let compiler = ProjectCompiler::new(tmp.project()).unwrap();
692        let state = compiler.preprocess().unwrap();
693        let sources = &state.sources.sources;
694
695        let cache = state.cache.as_cached().unwrap();
696
697        // 2 clean sources
698        assert_eq!(cache.cache.artifacts_len(), 2);
699        assert!(cache.cache.all_artifacts_exist());
700        assert_eq!(cache.dirty_sources.len(), 1);
701
702        let len = sources.values().map(|v| v.len()).sum::<usize>();
703        // single solc
704        assert_eq!(len, 1);
705
706        let filtered = &sources.values().next().unwrap()[0].1;
707
708        // 3 contracts total
709        assert_eq!(filtered.0.len(), 3);
710        // A is modified
711        assert_eq!(filtered.dirty().count(), 1);
712        assert!(filtered.dirty_files().next().unwrap().ends_with("A.sol"));
713
714        let state = state.compile().unwrap();
715        assert_eq!(state.output.sources.len(), 1);
716        for (f, source) in state.output.sources.sources() {
717            if f.ends_with("A.sol") {
718                assert!(source.ast.is_some());
719            } else {
720                assert!(source.ast.is_none());
721            }
722        }
723
724        assert_eq!(state.output.contracts.len(), 1);
725        let (a, c) = state.output.contracts_iter().next().unwrap();
726        assert_eq!(a, "A");
727        assert!(c.abi.is_some() && c.evm.is_some());
728
729        let state = state.write_artifacts().unwrap();
730        assert_eq!(state.compiled_artifacts.as_ref().len(), 1);
731
732        let out = state.write_cache().unwrap();
733
734        let artifacts: Vec<_> = out.into_artifacts().collect();
735        assert_eq!(artifacts.len(), 3);
736        for (_, artifact) in artifacts {
737            let c = artifact.into_contract_bytecode();
738            assert!(c.abi.is_some() && c.bytecode.is_some() && c.deployed_bytecode.is_some());
739        }
740
741        tmp.artifacts_snapshot().unwrap().assert_artifacts_essentials_present();
742    }
743
744    #[test]
745    #[ignore]
746    fn can_compile_real_project() {
747        init_tracing();
748        let paths = ProjectPathsConfig::builder()
749            .root("../../foundry-integration-tests/testdata/solmate")
750            .build()
751            .unwrap();
752        let project = Project::builder().paths(paths).build(Default::default()).unwrap();
753        let compiler = ProjectCompiler::new(&project).unwrap();
754        let _out = compiler.compile().unwrap();
755    }
756
757    #[test]
758    fn extra_output_cached() {
759        let root = Path::new(env!("CARGO_MANIFEST_DIR")).join("../../test-data/dapp-sample");
760        let paths = ProjectPathsConfig::builder().sources(root.join("src")).lib(root.join("lib"));
761        let mut project = TempProject::<MultiCompiler>::new(paths).unwrap();
762
763        // Compile once without enabled extra output
764        project.compile().unwrap();
765
766        // Enable extra output of abi
767        project.project_mut().artifacts =
768            ConfigurableArtifacts::new([], [ContractOutputSelection::Abi]);
769
770        // Ensure that abi appears after compilation and that we didn't recompile anything
771        let abi_path = project.project().paths.artifacts.join("Dapp.sol/Dapp.abi.json");
772        assert!(!abi_path.exists());
773        let output = project.compile().unwrap();
774        assert!(output.compiler_output.is_empty());
775        assert!(abi_path.exists());
776    }
777
778    #[test]
779    fn can_compile_leftovers_after_sparse() {
780        let mut tmp = TempProject::<MultiCompiler, ConfigurableArtifacts>::dapptools().unwrap();
781
782        tmp.add_source(
783            "A",
784            r#"
785pragma solidity ^0.8.10;
786import "./B.sol";
787contract A {}
788"#,
789        )
790        .unwrap();
791
792        tmp.add_source(
793            "B",
794            r#"
795pragma solidity ^0.8.10;
796contract B {}
797"#,
798        )
799        .unwrap();
800
801        tmp.project_mut().sparse_output = Some(Box::new(|f: &Path| f.ends_with("A.sol")));
802        let compiled = tmp.compile().unwrap();
803        compiled.assert_success();
804        assert_eq!(compiled.artifacts().count(), 1);
805
806        tmp.project_mut().sparse_output = None;
807        let compiled = tmp.compile().unwrap();
808        compiled.assert_success();
809        assert_eq!(compiled.artifacts().count(), 2);
810    }
811}