Skip to main content

foundry_compilers/compile/
project.rs

1//! Manages compiling of a `Project`
2//!
3//! The compilation of a project is performed in several steps.
4//!
5//! First the project's dependency graph [`crate::Graph`] is constructed and all imported
6//! dependencies are resolved. The graph holds all the relationships between the files and their
7//! versions. From there the appropriate version set is derived
8//! [`crate::Graph`] which need to be compiled with different
9//! [`crate::compilers::solc::Solc`] versions.
10//!
11//! At this point we check if we need to compile a source file or whether we can reuse an _existing_
12//! `Artifact`. We don't to compile if:
13//!     - caching is enabled
14//!     - the file is **not** dirty
15//!     - the artifact for that file exists
16//!
17//! This concludes the preprocessing, and we now have either
18//!    - only `Source` files that need to be compiled
19//!    - only cached `Artifacts`, compilation can be skipped. This is considered an unchanged,
20//!      cached project
21//!    - Mix of both `Source` and `Artifacts`, only the `Source` files need to be compiled, the
22//!      `Artifacts` can be reused.
23//!
24//! The final step is invoking `Solc` via the standard JSON format.
25//!
26//! ### Notes on [Import Path Resolution](https://docs.soliditylang.org/en/develop/path-resolution.html#path-resolution)
27//!
28//! In order to be able to support reproducible builds on all platforms, the Solidity compiler has
29//! to abstract away the details of the filesystem where source files are stored. Paths used in
30//! imports must work the same way everywhere while the command-line interface must be able to work
31//! with platform-specific paths to provide good user experience. This section aims to explain in
32//! detail how Solidity reconciles these requirements.
33//!
34//! The compiler maintains an internal database (virtual filesystem or VFS for short) where each
35//! source unit is assigned a unique source unit name which is an opaque and unstructured
36//! identifier. When you use the import statement, you specify an import path that references a
37//! source unit name. If the compiler does not find any source unit name matching the import path in
38//! the VFS, it invokes the callback, which is responsible for obtaining the source code to be
39//! placed under that name.
40//!
41//! This becomes relevant when dealing with resolved imports
42//!
43//! #### Relative Imports
44//!
45//! ```solidity
46//! import "./math/math.sol";
47//! import "contracts/tokens/token.sol";
48//! ```
49//! In the above `./math/math.sol` and `contracts/tokens/token.sol` are import paths while the
50//! source unit names they translate to are `contracts/math/math.sol` and
51//! `contracts/tokens/token.sol` respectively.
52//!
53//! #### Direct Imports
54//!
55//! An import that does not start with `./` or `../` is a direct import.
56//!
57//! ```solidity
58//! import "/project/lib/util.sol";         // source unit name: /project/lib/util.sol
59//! import "lib/util.sol";                  // source unit name: lib/util.sol
60//! import "@openzeppelin/address.sol";     // source unit name: @openzeppelin/address.sol
61//! import "https://example.com/token.sol"; // source unit name: <https://example.com/token.sol>
62//! ```
63//!
64//! After applying any import remappings the import path simply becomes the source unit name.
65//!
66//! ##### Import Remapping
67//!
68//! ```solidity
69//! import "github.com/ethereum/dapp-bin/library/math.sol"; // source unit name: dapp-bin/library/math.sol
70//! ```
71//!
72//! If compiled with `solc github.com/ethereum/dapp-bin/=dapp-bin/` the compiler will look for the
73//! file in the VFS under `dapp-bin/library/math.sol`. If the file is not available there, the
74//! source unit name will be passed to the Host Filesystem Loader, which will then look in
75//! `/project/dapp-bin/library/iterable_mapping.sol`
76//!
77//!
78//! ### Caching and Change detection
79//!
80//! If caching is enabled in the [Project] a cache file will be created upon a successful solc
81//! build. The [cache file](crate::cache::CompilerCache) stores metadata for all the files that were
82//! provided to solc.
83//! For every file the cache file contains a dedicated [cache entry](crate::cache::CacheEntry),
84//! which represents the state of the file. A solidity file can contain several contracts, for every
85//! contract a separate [artifact](crate::Artifact) is emitted. Therefore the entry also tracks all
86//! artifacts emitted by a file. A solidity file can also be compiled with several solc versions.
87//!
88//! For example in `A(<=0.8.10) imports C(>0.4.0)` and
89//! `B(0.8.11) imports C(>0.4.0)`, both `A` and `B` import `C` but there's no solc version that's
90//! compatible with `A` and `B`, in which case two sets are compiled: [`A`, `C`] and [`B`, `C`].
91//! This is reflected in the cache entry which tracks the file's artifacts by version.
92//!
93//! The cache makes it possible to detect changes during recompilation, so that only the changed,
94//! dirty, files need to be passed to solc. A file will be considered as dirty if:
95//!   - the file is new, not included in the existing cache
96//!   - the file was modified since the last compiler run, detected by comparing content hashes
97//!   - any of the imported files is dirty
98//!   - the file's artifacts don't exist, were deleted.
99//!
100//! Recompiling a project with cache enabled detects all files that meet these criteria and provides
101//! solc with only these dirty files instead of the entire source set.
102
103use crate::{
104    ArtifactOutput, CompilerSettings, Graph, Project, ProjectCompileOutput, ProjectPathsConfig,
105    Sources,
106    artifact_output::Artifacts,
107    buildinfo::RawBuildInfo,
108    cache::ArtifactsCache,
109    compilers::{Compiler, CompilerInput, CompilerOutput, Language},
110    filter::SparseOutputFilter,
111    output::{AggregatedCompilerOutput, Builds},
112    report,
113    resolver::{GraphEdges, ResolvedSources},
114};
115use foundry_compilers_core::error::Result;
116use rayon::prelude::*;
117use semver::Version;
118use std::{
119    collections::{HashMap, HashSet},
120    fmt::Debug,
121    path::PathBuf,
122    time::Instant,
123};
124
125/// A set of different Solc installations with their version and the sources to be compiled
126pub(crate) type VersionedSources<'a, L, S> = HashMap<L, Vec<(Version, Sources, (&'a str, &'a S))>>;
127
128/// Invoked before the actual compiler invocation and can override the input.
129///
130/// Updates the list of identified cached mocks (if any) to be stored in cache and updates the
131/// compiler input.
132pub trait Preprocessor<C: Compiler>: Debug {
133    fn preprocess(
134        &self,
135        compiler: &C,
136        input: &mut C::Input,
137        paths: &ProjectPathsConfig<C::Language>,
138        mocks: &mut HashSet<PathBuf>,
139    ) -> Result<()>;
140}
141
142#[derive(Debug)]
143pub struct ProjectCompiler<
144    'a,
145    T: ArtifactOutput<CompilerContract = C::CompilerContract>,
146    C: Compiler,
147> {
148    /// Contains the relationship of the source files and their imports
149    edges: GraphEdges<C::Parser>,
150    project: &'a Project<C, T>,
151    /// A mapping from a source file path to the primary profile name selected for it.
152    primary_profiles: HashMap<PathBuf, &'a str>,
153    /// how to compile all the sources
154    sources: CompilerSources<'a, C::Language, C::Settings>,
155    /// Optional preprocessor
156    preprocessor: Option<Box<dyn Preprocessor<C>>>,
157}
158
159impl<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
160    ProjectCompiler<'a, T, C>
161{
162    /// Create a new `ProjectCompiler` to bootstrap the compilation process of the project's
163    /// sources.
164    pub fn new(project: &'a Project<C, T>) -> Result<Self> {
165        Self::with_sources(project, project.paths.read_input_files()?)
166    }
167
168    /// Bootstraps the compilation process by resolving the dependency graph of all sources and the
169    /// appropriate `Solc` -> `Sources` set as well as the compile mode to use (parallel,
170    /// sequential)
171    ///
172    /// Multiple (`Solc` -> `Sources`) pairs can be compiled in parallel if the `Project` allows
173    /// multiple `jobs`, see [`crate::Project::set_solc_jobs()`].
174    #[instrument(name = "ProjectCompiler::new", skip_all)]
175    pub fn with_sources(project: &'a Project<C, T>, mut sources: Sources) -> Result<Self> {
176        if let Some(filter) = &project.sparse_output {
177            sources.retain(|f, _| filter.is_match(f))
178        }
179        let graph = Graph::resolve_sources(&project.paths, sources)?;
180        let ResolvedSources { sources, primary_profiles, edges } =
181            graph.into_sources_by_version(project)?;
182
183        // If there are multiple different versions, and we can use multiple jobs we can compile
184        // them in parallel.
185        let jobs_cnt = || sources.values().map(|v| v.len()).sum::<usize>();
186        let sources = CompilerSources {
187            jobs: (project.solc_jobs > 1 && jobs_cnt() > 1).then_some(project.solc_jobs),
188            sources,
189        };
190
191        Ok(Self { edges, primary_profiles, project, sources, preprocessor: None })
192    }
193
194    pub fn with_preprocessor(self, preprocessor: impl Preprocessor<C> + 'static) -> Self {
195        Self { preprocessor: Some(Box::new(preprocessor)), ..self }
196    }
197
198    /// Compiles all the sources of the `Project` in the appropriate mode
199    ///
200    /// If caching is enabled, the sources are filtered and only _dirty_ sources are recompiled.
201    ///
202    /// The output of the compile process can be a mix of reused artifacts and freshly compiled
203    /// `Contract`s
204    ///
205    /// # Examples
206    /// ```no_run
207    /// use foundry_compilers::Project;
208    ///
209    /// let project = Project::builder().build(Default::default())?;
210    /// let output = project.compile()?;
211    /// # Ok::<(), Box<dyn std::error::Error>>(())
212    /// ```
213    #[instrument(name = "compile_project", skip_all)]
214    pub fn compile(self) -> Result<ProjectCompileOutput<C, T>> {
215        let slash_paths = self.project.slash_paths;
216
217        // drive the compiler statemachine to completion
218        let mut output = self.preprocess()?.compile()?.write_artifacts()?.write_cache()?;
219
220        if slash_paths {
221            // ensures we always use `/` paths
222            output.slash_paths();
223        }
224
225        Ok(output)
226    }
227
228    /// Does basic preprocessing
229    ///   - sets proper source unit names
230    ///   - check cache
231    #[instrument(skip_all)]
232    fn preprocess(self) -> Result<PreprocessedState<'a, T, C>> {
233        trace!("preprocessing");
234        let Self { edges, project, mut sources, primary_profiles, preprocessor } = self;
235
236        // convert paths on windows to ensure consistency with the `CompilerOutput` `solc` emits,
237        // which is unix style `/`
238        sources.slash_paths();
239
240        let mut cache = ArtifactsCache::new(project, edges, preprocessor.is_some())?;
241        // retain and compile only dirty sources and all their imports
242        sources.filter(&mut cache);
243
244        Ok(PreprocessedState { sources, cache, primary_profiles, preprocessor })
245    }
246}
247
248/// A series of states that comprise the [`ProjectCompiler::compile()`] state machine
249///
250/// The main reason is to debug all states individually
251#[derive(Debug)]
252struct PreprocessedState<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
253{
254    /// Contains all the sources to compile.
255    sources: CompilerSources<'a, C::Language, C::Settings>,
256
257    /// Cache that holds `CacheEntry` objects if caching is enabled and the project is recompiled
258    cache: ArtifactsCache<'a, T, C>,
259
260    /// A mapping from a source file path to the primary profile name selected for it.
261    primary_profiles: HashMap<PathBuf, &'a str>,
262
263    /// Optional preprocessor
264    preprocessor: Option<Box<dyn Preprocessor<C>>>,
265}
266
267impl<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
268    PreprocessedState<'a, T, C>
269{
270    /// advance to the next state by compiling all sources
271    #[instrument(skip_all)]
272    fn compile(self) -> Result<CompiledState<'a, T, C>> {
273        trace!("compiling");
274        let PreprocessedState { sources, mut cache, primary_profiles, preprocessor } = self;
275
276        let mut output = sources.compile(&mut cache, preprocessor)?;
277
278        // source paths get stripped before handing them over to solc, so solc never uses absolute
279        // paths, instead `--base-path <root dir>` is set. this way any metadata that's derived from
280        // data (paths) is relative to the project dir and should be independent of the current OS
281        // disk. However internally we still want to keep absolute paths, so we join the
282        // contracts again
283        output.join_all(cache.project().root());
284
285        Ok(CompiledState { output, cache, primary_profiles })
286    }
287}
288
289/// Represents the state after `solc` was successfully invoked
290#[derive(Debug)]
291struct CompiledState<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler> {
292    output: AggregatedCompilerOutput<C>,
293    cache: ArtifactsCache<'a, T, C>,
294    primary_profiles: HashMap<PathBuf, &'a str>,
295}
296
297impl<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
298    CompiledState<'a, T, C>
299{
300    /// advance to the next state by handling all artifacts
301    ///
302    /// Writes all output contracts to disk if enabled in the `Project` and if the build was
303    /// successful
304    #[instrument(skip_all)]
305    fn write_artifacts(self) -> Result<ArtifactsState<'a, T, C>> {
306        let CompiledState { output, cache, primary_profiles } = self;
307
308        let project = cache.project();
309        let ctx = cache.output_ctx();
310        // write all artifacts via the handler but only if the build succeeded and project wasn't
311        // configured with `no_artifacts == true`
312        let compiled_artifacts = if project.no_artifacts {
313            project.artifacts_handler().output_to_artifacts(
314                &output.contracts,
315                &output.sources,
316                ctx,
317                &project.paths,
318                &primary_profiles,
319            )
320        } else if output.has_error(
321            &project.ignored_error_codes,
322            &project.ignored_error_codes_from,
323            &project.ignored_file_paths,
324            &project.compiler_severity_filter,
325        ) {
326            trace!("skip writing cache file due to solc errors: {:?}", output.errors);
327            project.artifacts_handler().output_to_artifacts(
328                &output.contracts,
329                &output.sources,
330                ctx,
331                &project.paths,
332                &primary_profiles,
333            )
334        } else {
335            trace!(
336                "handling artifact output for {} contracts and {} sources",
337                output.contracts.len(),
338                output.sources.len()
339            );
340            // this emits the artifacts via the project's artifacts handler
341            let artifacts = project.artifacts_handler().on_output(
342                &output.contracts,
343                &output.sources,
344                &project.paths,
345                ctx,
346                &primary_profiles,
347            )?;
348
349            // emits all the build infos, if they exist
350            output.write_build_infos(project.build_info_path())?;
351
352            artifacts
353        };
354
355        Ok(ArtifactsState { output, cache, compiled_artifacts })
356    }
357}
358
359/// Represents the state after all artifacts were written to disk
360#[derive(Debug)]
361struct ArtifactsState<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler> {
362    output: AggregatedCompilerOutput<C>,
363    cache: ArtifactsCache<'a, T, C>,
364    compiled_artifacts: Artifacts<T::Artifact>,
365}
366
367impl<T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
368    ArtifactsState<'_, T, C>
369{
370    /// Writes the cache file
371    ///
372    /// this concludes the [`Project::compile()`] statemachine
373    #[instrument(skip_all)]
374    fn write_cache(self) -> Result<ProjectCompileOutput<C, T>> {
375        let ArtifactsState { output, cache, compiled_artifacts } = self;
376        let project = cache.project();
377        let ignored_error_codes = project.ignored_error_codes.clone();
378        let ignored_error_codes_from = project.ignored_error_codes_from.clone();
379        let ignored_file_paths = project.ignored_file_paths.clone();
380        let compiler_severity_filter = project.compiler_severity_filter;
381        let has_error = output.has_error(
382            &ignored_error_codes,
383            &ignored_error_codes_from,
384            &ignored_file_paths,
385            &compiler_severity_filter,
386        );
387        let skip_write_to_disk = project.no_artifacts || has_error;
388        trace!(has_error, project.no_artifacts, skip_write_to_disk, cache_path=?project.cache_path(),"prepare writing cache file");
389
390        let (cached_artifacts, cached_builds, edges) =
391            cache.consume(&compiled_artifacts, &output.build_infos, !skip_write_to_disk)?;
392
393        project.artifacts_handler().handle_cached_artifacts(&cached_artifacts)?;
394
395        let builds = Builds(
396            output
397                .build_infos
398                .iter()
399                .map(|build_info| (build_info.id.clone(), build_info.build_context.clone()))
400                .chain(cached_builds)
401                .map(|(id, context)| (id, context.with_joined_paths(project.paths.root.as_path())))
402                .collect(),
403        );
404
405        Ok(ProjectCompileOutput {
406            compiler_output: output,
407            compiled_artifacts,
408            cached_artifacts,
409            ignored_error_codes,
410            ignored_error_codes_from,
411            ignored_file_paths,
412            compiler_severity_filter,
413            builds,
414            edges,
415        })
416    }
417}
418
419/// Determines how the `solc <-> sources` pairs are executed.
420#[derive(Debug, Clone)]
421struct CompilerSources<'a, L, S> {
422    /// The sources to compile.
423    sources: VersionedSources<'a, L, S>,
424    /// The number of jobs to use for parallel compilation.
425    jobs: Option<usize>,
426}
427
428impl<L: Language, S: CompilerSettings> CompilerSources<'_, L, S> {
429    /// Converts all `\\` separators to `/`.
430    ///
431    /// This effectively ensures that `solc` can find imported files like `/src/Cheats.sol` in the
432    /// VFS (the `CompilerInput` as json) under `src/Cheats.sol`.
433    #[allow(clippy::missing_const_for_fn)]
434    fn slash_paths(&mut self) {
435        #[cfg(windows)]
436        {
437            use path_slash::PathBufExt;
438
439            self.sources.values_mut().for_each(|versioned_sources| {
440                versioned_sources.iter_mut().for_each(|(_, sources, _)| {
441                    *sources = std::mem::take(sources)
442                        .into_iter()
443                        .map(|(path, source)| {
444                            (PathBuf::from(path.to_slash_lossy().as_ref()), source)
445                        })
446                        .collect()
447                })
448            });
449        }
450    }
451
452    /// Filters out all sources that don't need to be compiled, see [`ArtifactsCache::filter`]
453    #[instrument(name = "CompilerSources::filter", skip_all)]
454    fn filter<
455        T: ArtifactOutput<CompilerContract = C::CompilerContract>,
456        C: Compiler<Language = L>,
457    >(
458        &mut self,
459        cache: &mut ArtifactsCache<'_, T, C>,
460    ) {
461        cache.remove_dirty_sources();
462        for versioned_sources in self.sources.values_mut() {
463            for (version, sources, (profile, _)) in versioned_sources {
464                trace!("Filtering {} sources for {}", sources.len(), version);
465                cache.filter(sources, version, profile);
466                trace!(
467                    "Detected {} sources to compile {:?}",
468                    sources.dirty().count(),
469                    sources.dirty_files().collect::<Vec<_>>()
470                );
471            }
472        }
473    }
474
475    /// Compiles all the files with `Solc`
476    fn compile<
477        C: Compiler<Language = L, Settings = S>,
478        T: ArtifactOutput<CompilerContract = C::CompilerContract>,
479    >(
480        self,
481        cache: &mut ArtifactsCache<'_, T, C>,
482        preprocessor: Option<Box<dyn Preprocessor<C>>>,
483    ) -> Result<AggregatedCompilerOutput<C>> {
484        let project = cache.project();
485        let graph = cache.graph();
486
487        let jobs_cnt = self.jobs;
488
489        let sparse_output = SparseOutputFilter::new(project.sparse_output.as_deref());
490
491        // Include additional paths collected during graph resolution.
492        let mut include_paths = project.paths.include_paths.clone();
493        include_paths.extend(graph.include_paths().clone());
494
495        // Get current list of mocks from cache. This will be passed to preprocessors and updated
496        // accordingly, then set back in cache.
497        let mut mocks = cache.mocks();
498
499        let mut jobs = Vec::new();
500        for (language, versioned_sources) in self.sources {
501            for (version, sources, (profile, opt_settings)) in versioned_sources {
502                let mut opt_settings = opt_settings.clone();
503                if sources.is_empty() {
504                    // nothing to compile
505                    trace!("skip {} for empty sources set", version);
506                    continue;
507                }
508
509                // depending on the composition of the filtered sources, the output selection can be
510                // optimized
511                let actually_dirty =
512                    sparse_output.sparse_sources(&sources, &mut opt_settings, graph);
513
514                if actually_dirty.is_empty() {
515                    // nothing to compile for this particular language, all dirty files are in the
516                    // other language set
517                    trace!("skip {} run due to empty source set", version);
518                    continue;
519                }
520
521                trace!("calling {} with {} sources {:?}", version, sources.len(), sources.keys());
522
523                let settings = opt_settings
524                    .with_base_path(&project.paths.root)
525                    .with_allow_paths(&project.paths.allowed_paths)
526                    .with_include_paths(&include_paths)
527                    .with_remappings(&project.paths.remappings);
528
529                let mut input = C::Input::build(sources, settings, language, version.clone());
530
531                input.strip_prefix(project.paths.root.as_path());
532
533                if let Some(preprocessor) = preprocessor.as_ref() {
534                    preprocessor.preprocess(
535                        &project.compiler,
536                        &mut input,
537                        &project.paths,
538                        &mut mocks,
539                    )?;
540                }
541
542                jobs.push((input, profile, actually_dirty));
543            }
544        }
545
546        // Update cache with mocks updated by preprocessors.
547        cache.update_mocks(mocks);
548
549        let results = if let Some(num_jobs) = jobs_cnt {
550            compile_parallel(&project.compiler, jobs, num_jobs)
551        } else {
552            compile_sequential(&project.compiler, jobs)
553        }?;
554
555        let mut aggregated = AggregatedCompilerOutput::default();
556
557        for (input, mut output, profile, actually_dirty) in results {
558            let version = input.version();
559
560            // Mark all files as seen by the compiler
561            for file in &actually_dirty {
562                cache.compiler_seen(file);
563            }
564
565            let build_info = RawBuildInfo::new(&input, &output, project.build_info)?;
566
567            output.retain_files(
568                actually_dirty
569                    .iter()
570                    .map(|f| f.strip_prefix(project.paths.root.as_path()).unwrap_or(f)),
571            );
572            output.join_all(project.paths.root.as_path());
573
574            aggregated.extend(version.clone(), build_info, profile, output);
575        }
576
577        Ok(aggregated)
578    }
579}
580
581type CompilationResult<'a, I, E, C> = Result<Vec<(I, CompilerOutput<E, C>, &'a str, Vec<PathBuf>)>>;
582
583/// Compiles the input set sequentially and returns a [Vec] of outputs.
584fn compile_sequential<'a, C: Compiler>(
585    compiler: &C,
586    jobs: Vec<(C::Input, &'a str, Vec<PathBuf>)>,
587) -> CompilationResult<'a, C::Input, C::CompilationError, C::CompilerContract> {
588    jobs.into_iter()
589        .map(|(input, profile, actually_dirty)| {
590            let start = Instant::now();
591            report::compiler_spawn(
592                &input.compiler_name(),
593                input.version(),
594                actually_dirty.as_slice(),
595            );
596            let output = compiler.compile(&input)?;
597            report::compiler_success(&input.compiler_name(), input.version(), &start.elapsed());
598
599            Ok((input, output, profile, actually_dirty))
600        })
601        .collect()
602}
603
604/// compiles the input set using `num_jobs` threads
605fn compile_parallel<'a, C: Compiler>(
606    compiler: &C,
607    jobs: Vec<(C::Input, &'a str, Vec<PathBuf>)>,
608    num_jobs: usize,
609) -> CompilationResult<'a, C::Input, C::CompilationError, C::CompilerContract> {
610    // need to get the currently installed reporter before installing the pool, otherwise each new
611    // thread in the pool will get initialized with the default value of the `thread_local!`'s
612    // localkey. This way we keep access to the reporter in the rayon pool
613    let scoped_report = report::get_default(|reporter| reporter.clone());
614
615    // start a rayon threadpool that will execute all `Solc::compile()` processes
616    let pool = rayon::ThreadPoolBuilder::new().num_threads(num_jobs).build().unwrap();
617
618    pool.install(move || {
619        jobs.into_par_iter()
620            .map(move |(input, profile, actually_dirty)| {
621                // set the reporter on this thread
622                let _guard = report::set_scoped(&scoped_report);
623
624                let start = Instant::now();
625                report::compiler_spawn(
626                    &input.compiler_name(),
627                    input.version(),
628                    actually_dirty.as_slice(),
629                );
630                compiler.compile(&input).map(move |output| {
631                    report::compiler_success(
632                        &input.compiler_name(),
633                        input.version(),
634                        &start.elapsed(),
635                    );
636                    (input, output, profile, actually_dirty)
637                })
638            })
639            .collect()
640    })
641}
642
643#[cfg(test)]
644#[cfg(all(feature = "project-util", feature = "svm-solc"))]
645mod tests {
646    use std::path::Path;
647
648    use foundry_compilers_artifacts::output_selection::ContractOutputSelection;
649
650    use crate::{
651        ConfigurableArtifacts, MinimalCombinedArtifacts, compilers::multi::MultiCompiler,
652        project_util::TempProject,
653    };
654
655    use super::*;
656
657    fn init_tracing() {
658        let _ = tracing_subscriber::fmt()
659            .with_env_filter(tracing_subscriber::EnvFilter::from_default_env())
660            .try_init()
661            .ok();
662    }
663
664    #[test]
665    fn can_preprocess() {
666        let root = Path::new(env!("CARGO_MANIFEST_DIR")).join("../../test-data/dapp-sample");
667        let project = Project::builder()
668            .paths(ProjectPathsConfig::dapptools(&root).unwrap())
669            .build(Default::default())
670            .unwrap();
671
672        let compiler = ProjectCompiler::new(&project).unwrap();
673        let prep = compiler.preprocess().unwrap();
674        let cache = prep.cache.as_cached().unwrap();
675        // ensure that we have exactly 3 empty entries which will be filled on compilation.
676        assert_eq!(cache.cache.files.len(), 3);
677        assert!(cache.cache.files.values().all(|v| v.artifacts.is_empty()));
678
679        let compiled = prep.compile().unwrap();
680        assert_eq!(compiled.output.contracts.files().count(), 3);
681    }
682
683    #[test]
684    fn can_detect_cached_files() {
685        let root = Path::new(env!("CARGO_MANIFEST_DIR")).join("../../test-data/dapp-sample");
686        let paths = ProjectPathsConfig::builder().sources(root.join("src")).lib(root.join("lib"));
687        let project = TempProject::<MultiCompiler, MinimalCombinedArtifacts>::new(paths).unwrap();
688
689        let compiled = project.compile().unwrap();
690        compiled.assert_success();
691
692        let inner = project.project();
693        let compiler = ProjectCompiler::new(inner).unwrap();
694        let prep = compiler.preprocess().unwrap();
695        assert!(prep.cache.as_cached().unwrap().dirty_sources.is_empty())
696    }
697
698    #[test]
699    fn can_recompile_with_optimized_output() {
700        let tmp = TempProject::<MultiCompiler, ConfigurableArtifacts>::dapptools().unwrap();
701
702        tmp.add_source(
703            "A",
704            r#"
705    pragma solidity ^0.8.10;
706    import "./B.sol";
707    contract A {}
708   "#,
709        )
710        .unwrap();
711
712        tmp.add_source(
713            "B",
714            r#"
715    pragma solidity ^0.8.10;
716    contract B {
717        function hello() public {}
718    }
719    import "./C.sol";
720   "#,
721        )
722        .unwrap();
723
724        tmp.add_source(
725            "C",
726            r"
727    pragma solidity ^0.8.10;
728    contract C {
729            function hello() public {}
730    }
731   ",
732        )
733        .unwrap();
734        let compiled = tmp.compile().unwrap();
735        compiled.assert_success();
736
737        tmp.artifacts_snapshot().unwrap().assert_artifacts_essentials_present();
738
739        // modify A.sol
740        tmp.add_source(
741            "A",
742            r#"
743    pragma solidity ^0.8.10;
744    import "./B.sol";
745    contract A {
746        function testExample() public {}
747    }
748   "#,
749        )
750        .unwrap();
751
752        let compiler = ProjectCompiler::new(tmp.project()).unwrap();
753        let state = compiler.preprocess().unwrap();
754        let sources = &state.sources.sources;
755
756        let cache = state.cache.as_cached().unwrap();
757
758        // 2 clean sources
759        assert_eq!(cache.cache.artifacts_len(), 2);
760        assert!(cache.cache.all_artifacts_exist());
761        assert_eq!(cache.dirty_sources.len(), 1);
762
763        let len = sources.values().map(|v| v.len()).sum::<usize>();
764        // single solc
765        assert_eq!(len, 1);
766
767        let filtered = &sources.values().next().unwrap()[0].1;
768
769        // 3 contracts total
770        assert_eq!(filtered.0.len(), 3);
771        // A is modified
772        assert_eq!(filtered.dirty().count(), 1);
773        assert!(filtered.dirty_files().next().unwrap().ends_with("A.sol"));
774
775        let state = state.compile().unwrap();
776        assert_eq!(state.output.sources.len(), 1);
777        for (f, source) in state.output.sources.sources() {
778            if f.ends_with("A.sol") {
779                assert!(source.ast.is_some());
780            } else {
781                assert!(source.ast.is_none());
782            }
783        }
784
785        assert_eq!(state.output.contracts.len(), 1);
786        let (a, c) = state.output.contracts_iter().next().unwrap();
787        assert_eq!(a, "A");
788        assert!(c.abi.is_some() && c.evm.is_some());
789
790        let state = state.write_artifacts().unwrap();
791        assert_eq!(state.compiled_artifacts.as_ref().len(), 1);
792
793        let out = state.write_cache().unwrap();
794
795        let artifacts: Vec<_> = out.into_artifacts().collect();
796        assert_eq!(artifacts.len(), 3);
797        for (_, artifact) in artifacts {
798            let c = artifact.into_contract_bytecode();
799            assert!(c.abi.is_some() && c.bytecode.is_some() && c.deployed_bytecode.is_some());
800        }
801
802        tmp.artifacts_snapshot().unwrap().assert_artifacts_essentials_present();
803    }
804
805    #[test]
806    #[ignore]
807    fn can_compile_real_project() {
808        init_tracing();
809        let paths = ProjectPathsConfig::builder()
810            .root("../../foundry-integration-tests/testdata/solmate")
811            .build()
812            .unwrap();
813        let project = Project::builder().paths(paths).build(Default::default()).unwrap();
814        let compiler = ProjectCompiler::new(&project).unwrap();
815        let _out = compiler.compile().unwrap();
816    }
817
818    #[test]
819    fn extra_output_cached() {
820        let root = Path::new(env!("CARGO_MANIFEST_DIR")).join("../../test-data/dapp-sample");
821        let paths = ProjectPathsConfig::builder().sources(root.join("src")).lib(root.join("lib"));
822        let mut project = TempProject::<MultiCompiler>::new(paths).unwrap();
823
824        // Compile once without enabled extra output
825        project.compile().unwrap();
826
827        // Enable extra output of abi
828        project.project_mut().artifacts =
829            ConfigurableArtifacts::new([], [ContractOutputSelection::Abi]);
830
831        // Ensure that abi appears after compilation and that we didn't recompile anything
832        let abi_path = project.project().paths.artifacts.join("Dapp.sol/Dapp.abi.json");
833        assert!(!abi_path.exists());
834        let output = project.compile().unwrap();
835        assert!(output.compiler_output.is_empty());
836        assert!(abi_path.exists());
837    }
838
839    #[test]
840    fn can_compile_leftovers_after_sparse() {
841        let mut tmp = TempProject::<MultiCompiler, ConfigurableArtifacts>::dapptools().unwrap();
842
843        tmp.add_source(
844            "A",
845            r#"
846pragma solidity ^0.8.10;
847import "./B.sol";
848contract A {}
849"#,
850        )
851        .unwrap();
852
853        tmp.add_source(
854            "B",
855            r#"
856pragma solidity ^0.8.10;
857contract B {}
858"#,
859        )
860        .unwrap();
861
862        tmp.project_mut().sparse_output = Some(Box::new(|f: &Path| f.ends_with("A.sol")));
863        let compiled = tmp.compile().unwrap();
864        compiled.assert_success();
865        assert_eq!(compiled.artifacts().count(), 1);
866
867        tmp.project_mut().sparse_output = None;
868        let compiled = tmp.compile().unwrap();
869        compiled.assert_success();
870        assert_eq!(compiled.artifacts().count(), 2);
871    }
872}