ethers_solc/compile/project.rs

//! Manages compiling of a `Project`
//!
//! The compilation of a project is performed in several steps.
//!
//! First, the project's dependency graph [`crate::Graph`] is constructed and all imported
//! dependencies are resolved. The graph holds all the relationships between the files and their
//! versions. From there, the appropriate sets of sources are derived, i.e. which sources need to
//! be compiled with which [`crate::Solc`] versions.
//!
//! At this point we check if we need to compile a source file or whether we can reuse an _existing_
//! `Artifact`. We don't need to compile if:
//!     - caching is enabled
//!     - the file is **not** dirty
//!     - the artifact for that file exists
//!
//! This concludes the preprocessing, and we now have either
//!    - only `Source` files that need to be compiled
//!    - only cached `Artifacts`; compilation can be skipped. This is considered an unchanged,
//!      cached project
//!    - a mix of both `Source` and `Artifacts`; only the `Source` files need to be compiled, the
//!      `Artifacts` can be reused
//!
//! The final step is invoking `Solc` via the standard JSON format.
//!
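//! A minimal sketch of these preprocessing steps, using the resolver APIs this module builds on
//! (`Graph::resolve` and `Graph::into_sources_by_version`):
//!
//! ```no_run
//! use ethers_solc::{Graph, Project};
//!
//! let project = Project::builder().build().unwrap();
//! // resolve the dependency graph of all source files and their imports
//! let graph = Graph::resolve(&project.paths).unwrap();
//! // derive which sources need to be compiled with which solc version
//! let (versions, _edges) = graph.into_sources_by_version(project.offline).unwrap();
//! ```
//!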
//! ### Notes on [Import Path Resolution](https://docs.soliditylang.org/en/develop/path-resolution.html#path-resolution)
//!
//! In order to be able to support reproducible builds on all platforms, the Solidity compiler has
//! to abstract away the details of the filesystem where source files are stored. Paths used in
//! imports must work the same way everywhere while the command-line interface must be able to work
//! with platform-specific paths to provide good user experience. This section aims to explain in
//! detail how Solidity reconciles these requirements.
//!
//! The compiler maintains an internal database (virtual filesystem or VFS for short) where each
//! source unit is assigned a unique source unit name which is an opaque and unstructured
//! identifier. When you use the import statement, you specify an import path that references a
//! source unit name. If the compiler does not find any source unit name matching the import path in
//! the VFS, it invokes the callback, which is responsible for obtaining the source code to be
//! placed under that name.
//!
//! This becomes relevant when dealing with resolved imports.
//!
//! #### Relative Imports
//!
//! ```solidity
//! import "./math/math.sol";
//! import "contracts/tokens/token.sol";
//! ```
//! In the above, assuming both imports appear in a file with the source unit name
//! `contracts/contract.sol`, `./math/math.sol` and `contracts/tokens/token.sol` are import paths,
//! while the source unit names they translate to are `contracts/math/math.sol` and
//! `contracts/tokens/token.sol` respectively.
//!
//! #### Direct Imports
//!
//! An import that does not start with `./` or `../` is a direct import.
//!
//! ```solidity
//! import "/project/lib/util.sol";         // source unit name: /project/lib/util.sol
//! import "lib/util.sol";                  // source unit name: lib/util.sol
//! import "@openzeppelin/address.sol";     // source unit name: @openzeppelin/address.sol
//! import "https://example.com/token.sol"; // source unit name: <https://example.com/token.sol>
//! ```
//!
//! After applying any import remappings, the import path simply becomes the source unit name.
//!
//! ##### Import Remapping
//!
//! ```solidity
//! import "github.com/ethereum/dapp-bin/library/math.sol"; // source unit name: dapp-bin/library/math.sol
//! ```
//!
//! If compiled with `solc github.com/ethereum/dapp-bin/=dapp-bin/`, the compiler will look for the
//! file in the VFS under `dapp-bin/library/math.sol`. If the file is not available there, the
//! source unit name will be passed to the Host Filesystem Loader, which will then look in
//! `/project/dapp-bin/library/math.sol`.
//!
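//! With `ethers_solc`, an equivalent remapping can be configured on the project paths. A minimal
//! sketch (assuming a `dapp-bin/` directory exists under the project root):
//!
//! ```no_run
//! use ethers_solc::{remappings::Remapping, ProjectPathsConfig};
//!
//! let paths = ProjectPathsConfig::builder()
//!     .remapping("github.com/ethereum/dapp-bin/=dapp-bin/".parse::<Remapping>().unwrap())
//!     .build()
//!     .unwrap();
//! ```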
//!
//! ### Caching and Change Detection
//!
//! If caching is enabled in the [Project], a cache file will be created upon a successful solc
//! build. The [cache file](crate::cache::SolFilesCache) stores metadata for all the files that were
//! provided to solc.
//! For every file the cache file contains a dedicated [cache entry](crate::cache::CacheEntry),
//! which represents the state of the file. A solidity file can contain several contracts, and for
//! every contract a separate [artifact](crate::Artifact) is emitted. Therefore the entry also
//! tracks all artifacts emitted by a file. A solidity file can also be compiled with several solc
//! versions.
//!
//! For example in `A(<=0.8.10) imports C(>0.4.0)` and `B(0.8.11) imports C(>0.4.0)`, both `A` and
//! `B` import `C`, but there's no single solc version that's compatible with both `A` and `B`. In
//! this case two sets are compiled: [`A`, `C`] and [`B`, `C`]. This is reflected in the cache
//! entry, which tracks the file's artifacts by version.
//!
//! The cache makes it possible to detect changes during recompilation, so that only the changed,
//! dirty, files need to be passed to solc. A file is considered dirty if:
//!   - the file is new, i.e. not included in the existing cache
//!   - the file was modified since the last compiler run, detected by comparing content hashes
//!   - any of the imported files is dirty
//!   - the file's artifacts don't exist or were deleted
//!
//! Recompiling a project with cache enabled detects all files that meet these criteria and provides
//! solc with only these dirty files instead of the entire source set, as sketched below.
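//!
//! A sketch of recompiling with caching enabled (assuming no source changed between the runs):
//!
//! ```no_run
//! use ethers_solc::Project;
//!
//! let project = Project::builder().build().unwrap();
//! // the first run compiles everything and writes the cache file
//! let _first = project.compile().unwrap();
//! // the second run detects no dirty files and reuses all cached artifacts
//! let second = project.compile().unwrap();
//! assert!(second.is_unchanged());
//! ```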
use crate::{
    artifact_output::Artifacts,
    artifacts::{Settings, VersionedFilteredSources, VersionedSources},
    buildinfo::RawBuildInfo,
    cache::ArtifactsCache,
    error::Result,
    filter::SparseOutputFilter,
    output::AggregatedCompilerOutput,
    report,
    resolver::GraphEdges,
    ArtifactOutput, CompilerInput, Graph, Project, ProjectCompileOutput, ProjectPathsConfig, Solc,
    Sources,
};
use rayon::prelude::*;
use std::{collections::btree_map::BTreeMap, path::PathBuf, time::Instant};
use tracing::trace;

#[derive(Debug)]
pub struct ProjectCompiler<'a, T: ArtifactOutput> {
    /// Contains the relationship of the source files and their imports
    edges: GraphEdges,
    project: &'a Project<T>,
    /// How to compile all the sources
    sources: CompilerSources,
    /// How to select solc [`crate::artifacts::CompilerOutput`] for files
    sparse_output: SparseOutputFilter,
}

impl<'a, T: ArtifactOutput> ProjectCompiler<'a, T> {
    /// Create a new `ProjectCompiler` to bootstrap the compilation process of the project's
    /// sources.
    ///
    /// # Example
    ///
    /// ```no_run
    /// use ethers_solc::Project;
    ///
    /// let project = Project::builder().build().unwrap();
    /// let output = project.compile().unwrap();
    /// ```
    #[cfg(all(feature = "svm-solc", not(target_arch = "wasm32")))]
    pub fn new(project: &'a Project<T>) -> Result<Self> {
        Self::with_sources(project, project.paths.read_input_files()?)
    }

    /// Bootstraps the compilation process by resolving the dependency graph of all sources and the
    /// appropriate `Solc` -> `Sources` sets, as well as the compile mode to use (parallel or
    /// sequential).
    ///
    /// Multiple (`Solc` -> `Sources`) pairs can be compiled in parallel if the `Project` allows
    /// multiple `jobs`, see [`crate::Project::set_solc_jobs()`].
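    ///
    /// A sketch of enabling parallel compilation for a project that uses multiple solc versions:
    ///
    /// ```no_run
    /// use ethers_solc::Project;
    ///
    /// let mut project = Project::builder().build().unwrap();
    /// // allow up to 4 solc processes to run in parallel
    /// project.set_solc_jobs(4);
    /// let output = project.compile().unwrap();
    /// ```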
    #[cfg(all(feature = "svm-solc", not(target_arch = "wasm32")))]
    pub fn with_sources(project: &'a Project<T>, sources: Sources) -> Result<Self> {
        let graph = Graph::resolve_sources(&project.paths, sources)?;
        let (versions, edges) = graph.into_sources_by_version(project.offline)?;

        let sources_by_version = versions.get(project)?;

        let sources = if project.solc_jobs > 1 && sources_by_version.len() > 1 {
            // if there are multiple different versions and we can use multiple jobs, we can
            // compile them in parallel
            CompilerSources::Parallel(sources_by_version, project.solc_jobs)
        } else {
            CompilerSources::Sequential(sources_by_version)
        };

        Ok(Self { edges, project, sources, sparse_output: Default::default() })
    }

    /// Compiles the sources with a pinned `Solc` instance.
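    ///
    /// A sketch of pinning a specific `solc` binary (the binary path used here is hypothetical):
    ///
    /// ```no_run
    /// use ethers_solc::{compile::project::ProjectCompiler, Project, Solc};
    ///
    /// let project = Project::builder().build().unwrap();
    /// let sources = project.paths.read_input_files().unwrap();
    /// let solc = Solc::new("/usr/bin/solc");
    /// let output = ProjectCompiler::with_sources_and_solc(&project, sources, solc)
    ///     .unwrap()
    ///     .compile()
    ///     .unwrap();
    /// ```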
    pub fn with_sources_and_solc(
        project: &'a Project<T>,
        sources: Sources,
        solc: Solc,
    ) -> Result<Self> {
        let version = solc.version()?;
        let (sources, edges) = Graph::resolve_sources(&project.paths, sources)?.into_sources();

        // make sure `solc` has all required arguments
        let solc = project.configure_solc_with_version(
            solc,
            Some(version.clone()),
            edges.include_paths().clone(),
        );

        let sources_by_version = BTreeMap::from([(solc, (version, sources))]);
        let sources = CompilerSources::Sequential(sources_by_version);

        Ok(Self { edges, project, sources, sparse_output: Default::default() })
    }

    /// Sets the filter to apply when selecting the solc output for specific files to be compiled
    pub fn with_sparse_output(mut self, sparse_output: impl Into<SparseOutputFilter>) -> Self {
        self.sparse_output = sparse_output.into();
        self
    }

    /// Compiles all the sources of the `Project` in the appropriate mode
    ///
    /// If caching is enabled, the sources are filtered and only _dirty_ sources are recompiled.
    ///
    /// The output of the compile process can be a mix of reused artifacts and freshly compiled
    /// `Contract`s
    ///
    /// # Example
    ///
    /// ```no_run
    /// use ethers_solc::Project;
    ///
    /// let project = Project::builder().build().unwrap();
    /// let output = project.compile().unwrap();
    /// ```
    pub fn compile(self) -> Result<ProjectCompileOutput<T>> {
        let slash_paths = self.project.slash_paths;

        // drive the compiler state machine to completion
        let mut output = self.preprocess()?.compile()?.write_artifacts()?.write_cache()?;

        if slash_paths {
            // ensures we always use `/` paths
            output.slash_paths();
        }

        Ok(output)
    }

    /// Does basic preprocessing
    ///   - sets proper source unit names
    ///   - checks the cache
    fn preprocess(self) -> Result<PreprocessedState<'a, T>> {
        trace!("preprocessing");
        let Self { edges, project, mut sources, sparse_output } = self;

        // convert paths on windows to ensure consistency with the `CompilerOutput` `solc` emits,
        // which is unix style `/`
        sources.slash_paths();

        let mut cache = ArtifactsCache::new(project, edges)?;
        // retain and compile only dirty sources and all their imports
        let sources = sources.filtered(&mut cache);

        Ok(PreprocessedState { sources, cache, sparse_output })
    }
}

/// A series of states that comprise the [`ProjectCompiler::compile()`] state machine
///
/// The main reason for this split is to make the individual states easier to debug
#[derive(Debug)]
struct PreprocessedState<'a, T: ArtifactOutput> {
    /// Contains all the sources to compile.
    sources: FilteredCompilerSources,

    /// Cache that holds `CacheEntry` objects if caching is enabled and the project is recompiled
    cache: ArtifactsCache<'a, T>,

    sparse_output: SparseOutputFilter,
}

impl<'a, T: ArtifactOutput> PreprocessedState<'a, T> {
    /// advance to the next state by compiling all sources
    fn compile(self) -> Result<CompiledState<'a, T>> {
        trace!("compiling");
        let PreprocessedState { sources, cache, sparse_output } = self;
        let project = cache.project();
        let mut output = sources.compile(
            &project.solc_config.settings,
            &project.paths,
            sparse_output,
            cache.graph(),
            project.build_info,
        )?;

        // Source paths get stripped before handing them over to solc, so solc never uses absolute
        // paths. Instead, `--base-path <root dir>` is set. This way any metadata that's derived
        // from the paths is relative to the project dir and should be independent of the current
        // OS and disk layout. However, internally we still want to keep absolute paths, so we join
        // the contracts again.
        output.join_all(cache.project().root());

        Ok(CompiledState { output, cache })
    }
}

/// Represents the state after `solc` was successfully invoked
#[derive(Debug)]
struct CompiledState<'a, T: ArtifactOutput> {
    output: AggregatedCompilerOutput,
    cache: ArtifactsCache<'a, T>,
}

impl<'a, T: ArtifactOutput> CompiledState<'a, T> {
    /// advance to the next state by handling all artifacts
    ///
    /// Writes all output contracts to disk if enabled in the `Project` and if the build was
    /// successful
    #[tracing::instrument(skip_all, name = "write-artifacts")]
    fn write_artifacts(self) -> Result<ArtifactsState<'a, T>> {
        let CompiledState { output, cache } = self;

        let project = cache.project();
        let ctx = cache.output_ctx();
        // write all artifacts via the handler, but only if the build succeeded and the project
        // wasn't configured with `no_artifacts == true`
        let compiled_artifacts = if project.no_artifacts {
            project.artifacts_handler().output_to_artifacts(
                &output.contracts,
                &output.sources,
                ctx,
                &project.paths,
            )
        } else if output.has_error(&project.ignored_error_codes, &project.compiler_severity_filter)
        {
            trace!("skip writing artifacts and cache file due to solc errors: {:?}", output.errors);
            project.artifacts_handler().output_to_artifacts(
                &output.contracts,
                &output.sources,
                ctx,
                &project.paths,
            )
        } else {
            trace!(
                "handling artifact output for {} contracts and {} sources",
                output.contracts.len(),
                output.sources.len()
            );
            // this emits the artifacts via the project's artifacts handler
            let artifacts = project.artifacts_handler().on_output(
                &output.contracts,
                &output.sources,
                &project.paths,
                ctx,
            )?;

            // emits all the build infos, if they exist
            output.write_build_infos(project.build_info_path())?;

            artifacts
        };

        Ok(ArtifactsState { output, cache, compiled_artifacts })
    }
}

/// Represents the state after all artifacts were written to disk
#[derive(Debug)]
struct ArtifactsState<'a, T: ArtifactOutput> {
    output: AggregatedCompilerOutput,
    cache: ArtifactsCache<'a, T>,
    compiled_artifacts: Artifacts<T::Artifact>,
}

impl<'a, T: ArtifactOutput> ArtifactsState<'a, T> {
    /// Writes the cache file
    ///
    /// this concludes the [`Project::compile()`] state machine
    fn write_cache(self) -> Result<ProjectCompileOutput<T>> {
        let ArtifactsState { output, cache, compiled_artifacts } = self;
        let project = cache.project();
        let ignored_error_codes = project.ignored_error_codes.clone();
        let compiler_severity_filter = project.compiler_severity_filter;
        let has_error = output.has_error(&ignored_error_codes, &compiler_severity_filter);
        let skip_write_to_disk = project.no_artifacts || has_error;
        trace!(has_error, project.no_artifacts, skip_write_to_disk, cache_path=?project.cache_path(), "prepare writing cache file");

        let cached_artifacts = cache.consume(&compiled_artifacts, !skip_write_to_disk)?;
        Ok(ProjectCompileOutput {
            compiler_output: output,
            compiled_artifacts,
            cached_artifacts,
            ignored_error_codes,
            compiler_severity_filter,
        })
    }
}

/// Determines how the `solc <-> sources` pairs are executed
#[derive(Debug, Clone)]
#[allow(dead_code)]
enum CompilerSources {
    /// Compile all these sequentially
    Sequential(VersionedSources),
    /// Compile all these in parallel using a certain number of jobs
    Parallel(VersionedSources, usize),
}

impl CompilerSources {
    /// Converts all `\\` separators to `/`
    ///
    /// This effectively ensures that `solc` can find imported files like `/src/Cheats.sol` in the
    /// VFS (the `CompilerInput` as json) under `src/Cheats.sol`.
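    ///
    /// E.g. on Windows (hypothetical path): `src\\Cheats.sol` becomes `src/Cheats.sol`.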
    fn slash_paths(&mut self) {
        #[cfg(windows)]
        {
            use path_slash::PathBufExt;

            fn slash_versioned_sources(v: &mut VersionedSources) {
                for (_, (_, sources)) in v {
                    *sources = std::mem::take(sources)
                        .into_iter()
                        .map(|(path, source)| {
                            (PathBuf::from(path.to_slash_lossy().as_ref()), source)
                        })
                        .collect()
                }
            }

            match self {
                CompilerSources::Sequential(v) => slash_versioned_sources(v),
                CompilerSources::Parallel(v, _) => slash_versioned_sources(v),
            };
        }
    }

    /// Filters out all sources that don't need to be compiled, see [`ArtifactsCache::filter`]
    fn filtered<T: ArtifactOutput>(self, cache: &mut ArtifactsCache<T>) -> FilteredCompilerSources {
        fn filtered_sources<T: ArtifactOutput>(
            sources: VersionedSources,
            cache: &mut ArtifactsCache<T>,
        ) -> VersionedFilteredSources {
            // fill all content hashes first so they're available for all source sets
            sources.iter().for_each(|(_, (_, sources))| {
                cache.fill_content_hashes(sources);
            });

            sources
                .into_iter()
                .map(|(solc, (version, sources))| {
                    trace!("Filtering {} sources for {}", sources.len(), version);
                    let sources = cache.filter(sources, &version);
                    trace!(
                        "Detected {} dirty sources {:?}",
                        sources.dirty().count(),
                        sources.dirty_files().collect::<Vec<_>>()
                    );
                    (solc, (version, sources))
                })
                .collect()
        }

        match self {
            CompilerSources::Sequential(s) => {
                FilteredCompilerSources::Sequential(filtered_sources(s, cache))
            }
            CompilerSources::Parallel(s, j) => {
                FilteredCompilerSources::Parallel(filtered_sources(s, cache), j)
            }
        }
    }
}

/// Determines how the `solc <-> sources` pairs are executed
#[derive(Debug, Clone)]
#[allow(dead_code)]
enum FilteredCompilerSources {
    /// Compile all these sequentially
    Sequential(VersionedFilteredSources),
    /// Compile all these in parallel using a certain number of jobs
    Parallel(VersionedFilteredSources, usize),
}

impl FilteredCompilerSources {
    /// Compiles all the files with `Solc`
    fn compile(
        self,
        settings: &Settings,
        paths: &ProjectPathsConfig,
        sparse_output: SparseOutputFilter,
        graph: &GraphEdges,
        create_build_info: bool,
    ) -> Result<AggregatedCompilerOutput> {
        match self {
            FilteredCompilerSources::Sequential(input) => {
                compile_sequential(input, settings, paths, sparse_output, graph, create_build_info)
            }
            FilteredCompilerSources::Parallel(input, j) => {
                compile_parallel(input, j, settings, paths, sparse_output, graph, create_build_info)
            }
        }
    }

    #[cfg(test)]
    #[allow(unused)]
    fn sources(&self) -> &VersionedFilteredSources {
        match self {
            FilteredCompilerSources::Sequential(v) => v,
            FilteredCompilerSources::Parallel(v, _) => v,
        }
    }
}

/// Compiles the input set sequentially and returns an aggregated set of the solc `CompilerOutput`s
fn compile_sequential(
    input: VersionedFilteredSources,
    settings: &Settings,
    paths: &ProjectPathsConfig,
    sparse_output: SparseOutputFilter,
    graph: &GraphEdges,
    create_build_info: bool,
) -> Result<AggregatedCompilerOutput> {
    let mut aggregated = AggregatedCompilerOutput::default();
    trace!("compiling {} jobs sequentially", input.len());
    for (solc, (version, filtered_sources)) in input {
        if filtered_sources.is_empty() {
            // nothing to compile
            trace!("skip solc {} {} for empty sources set", solc.as_ref().display(), version);
            continue
        }
        trace!(
            "compiling {} sources with solc \"{}\" {:?}",
            filtered_sources.len(),
            solc.as_ref().display(),
            solc.args
        );

        let dirty_files: Vec<PathBuf> = filtered_sources.dirty_files().cloned().collect();

        // depending on the composition of the filtered sources, the output selection can be
        // optimized
        let mut opt_settings = settings.clone();
        let sources = sparse_output.sparse_sources(filtered_sources, &mut opt_settings, graph);

        for input in CompilerInput::with_sources(sources) {
            let actually_dirty = input
                .sources
                .keys()
                .filter(|f| dirty_files.contains(f))
                .cloned()
                .collect::<Vec<_>>();
            if actually_dirty.is_empty() {
                // nothing to compile for this particular language, all dirty files are in the other
                // language set
                trace!(
                    "skip solc {} {} compilation of {} compiler input due to empty source set",
                    solc.as_ref().display(),
                    version,
                    input.language
                );
                continue
            }
            let input = input
                .settings(opt_settings.clone())
                .normalize_evm_version(&version)
                .with_remappings(paths.remappings.clone())
                .with_base_path(&paths.root)
                .sanitized(&version);

            trace!(
                "calling solc `{}` with {} sources {:?}",
                version,
                input.sources.len(),
                input.sources.keys()
            );

            let start = Instant::now();
            report::solc_spawn(&solc, &version, &input, &actually_dirty);
            let output = solc.compile(&input)?;
            report::solc_success(&solc, &version, &output, &start.elapsed());
            trace!("compiled input, output has error: {}", output.has_error());
            trace!("received compiler output: {:?}", output.contracts.keys());

            // if configured also create the build info
            if create_build_info {
                let build_info = RawBuildInfo::new(&input, &output, &version)?;
                aggregated.build_infos.insert(version.clone(), build_info);
            }

            aggregated.extend(version.clone(), output);
        }
    }
    Ok(aggregated)
}

/// compiles the input set using `num_jobs` threads
fn compile_parallel(
    input: VersionedFilteredSources,
    num_jobs: usize,
    settings: &Settings,
    paths: &ProjectPathsConfig,
    sparse_output: SparseOutputFilter,
    graph: &GraphEdges,
    create_build_info: bool,
) -> Result<AggregatedCompilerOutput> {
    debug_assert!(num_jobs > 1);
    trace!("compile {} sources in parallel using up to {} solc jobs", input.len(), num_jobs);

    let mut jobs = Vec::with_capacity(input.len());
    for (solc, (version, filtered_sources)) in input {
        if filtered_sources.is_empty() {
            // nothing to compile
            trace!("skip solc {} {} for empty sources set", solc.as_ref().display(), version);
            continue
        }

        let dirty_files: Vec<PathBuf> = filtered_sources.dirty_files().cloned().collect();

        // depending on the composition of the filtered sources, the output selection can be
        // optimized
        let mut opt_settings = settings.clone();
        let sources = sparse_output.sparse_sources(filtered_sources, &mut opt_settings, graph);

        for input in CompilerInput::with_sources(sources) {
            let actually_dirty = input
                .sources
                .keys()
                .filter(|f| dirty_files.contains(f))
                .cloned()
                .collect::<Vec<_>>();
            if actually_dirty.is_empty() {
                // nothing to compile for this particular language, all dirty files are in the other
                // language set
                trace!(
                    "skip solc {} {} compilation of {} compiler input due to empty source set",
                    solc.as_ref().display(),
                    version,
                    input.language
                );
                continue
            }

            // use the (potentially) optimized settings, mirroring the sequential path above
            let job = input
                .settings(opt_settings.clone())
                .normalize_evm_version(&version)
                .with_remappings(paths.remappings.clone())
                .with_base_path(&paths.root)
                .sanitized(&version);

            jobs.push((solc.clone(), version.clone(), job, actually_dirty))
        }
    }

    // need to get the currently installed reporter before installing the pool, otherwise each new
    // thread in the pool will get initialized with the default value of the `thread_local!`'s
    // localkey. This way we keep access to the reporter in the rayon pool
    let scoped_report = report::get_default(|reporter| reporter.clone());

    // start a rayon threadpool that will execute all `Solc::compile()` processes
    let pool = rayon::ThreadPoolBuilder::new().num_threads(num_jobs).build().unwrap();

    let outputs = pool.install(move || {
        jobs.into_par_iter()
            .map(move |(solc, version, input, actually_dirty)| {
                // set the reporter on this thread
                let _guard = report::set_scoped(&scoped_report);

                trace!(
                    "calling solc `{}` {:?} with {} sources: {:?}",
                    version,
                    solc.args,
                    input.sources.len(),
                    input.sources.keys()
                );
                let start = Instant::now();
                report::solc_spawn(&solc, &version, &input, &actually_dirty);
                solc.compile(&input).map(move |output| {
                    report::solc_success(&solc, &version, &output, &start.elapsed());
                    (version, input, output)
                })
            })
            .collect::<Result<Vec<_>>>()
    })?;

    let mut aggregated = AggregatedCompilerOutput::default();
    for (version, input, output) in outputs {
        // if configured also create the build info
        if create_build_info {
            let build_info = RawBuildInfo::new(&input, &output, &version)?;
            aggregated.build_infos.insert(version.clone(), build_info);
        }
        aggregated.extend(version, output);
    }

    Ok(aggregated)
}

#[cfg(test)]
#[cfg(all(feature = "project-util", feature = "svm-solc"))]
mod tests {
    use super::*;
    use crate::{project_util::TempProject, MinimalCombinedArtifacts};

    #[allow(unused)]
    fn init_tracing() {
        let _ = tracing_subscriber::fmt()
            .with_env_filter(tracing_subscriber::EnvFilter::from_default_env())
            .try_init()
            .ok();
    }

    #[test]
    fn can_preprocess() {
        let root = PathBuf::from(env!("CARGO_MANIFEST_DIR")).join("test-data/dapp-sample");
        let project =
            Project::builder().paths(ProjectPathsConfig::dapptools(root).unwrap()).build().unwrap();

        let compiler = ProjectCompiler::new(&project).unwrap();
        let prep = compiler.preprocess().unwrap();
        let cache = prep.cache.as_cached().unwrap();
        // 3 contracts
        assert_eq!(cache.dirty_source_files.len(), 3);
        assert!(cache.filtered.is_empty());
        assert!(cache.cache.is_empty());

        let compiled = prep.compile().unwrap();
        assert_eq!(compiled.output.contracts.files().count(), 3);
    }

    #[test]
    fn can_detect_cached_files() {
        let root = PathBuf::from(env!("CARGO_MANIFEST_DIR")).join("test-data/dapp-sample");
        let paths = ProjectPathsConfig::builder().sources(root.join("src")).lib(root.join("lib"));
        let project = TempProject::<MinimalCombinedArtifacts>::new(paths).unwrap();

        let compiled = project.compile().unwrap();
        compiled.assert_success();

        let inner = project.project();
        let compiler = ProjectCompiler::new(inner).unwrap();
        let prep = compiler.preprocess().unwrap();
        assert!(prep.cache.as_cached().unwrap().dirty_source_files.is_empty())
    }

    #[test]
    fn can_recompile_with_optimized_output() {
        let tmp = TempProject::dapptools().unwrap();

        tmp.add_source(
            "A",
            r#"
    pragma solidity ^0.8.10;
    import "./B.sol";
    contract A {}
   "#,
        )
        .unwrap();

        tmp.add_source(
            "B",
            r#"
    pragma solidity ^0.8.10;
    contract B {
        function hello() public {}
    }
    import "./C.sol";
   "#,
        )
        .unwrap();

        tmp.add_source(
            "C",
            r"
    pragma solidity ^0.8.10;
    contract C {
            function hello() public {}
    }
   ",
        )
        .unwrap();
        let compiled = tmp.compile().unwrap();
        compiled.assert_success();

        tmp.artifacts_snapshot().unwrap().assert_artifacts_essentials_present();

        // modify A.sol
        tmp.add_source(
            "A",
            r#"
    pragma solidity ^0.8.10;
    import "./B.sol";
    contract A {
        function testExample() public {}
    }
   "#,
        )
        .unwrap();

        let compiler = ProjectCompiler::new(tmp.project()).unwrap();
        let state = compiler.preprocess().unwrap();
        let sources = state.sources.sources();

        // single solc
        assert_eq!(sources.len(), 1);

        let (_, filtered) = sources.values().next().unwrap();

        // 3 contracts total
        assert_eq!(filtered.0.len(), 3);
        // A is modified
        assert_eq!(filtered.dirty().count(), 1);
        assert!(filtered.dirty_files().next().unwrap().ends_with("A.sol"));

        let state = state.compile().unwrap();
        assert_eq!(state.output.sources.len(), 3);
        for (f, source) in state.output.sources.sources() {
            if f.ends_with("A.sol") {
                assert!(source.ast.is_some());
            } else {
                assert!(source.ast.is_none());
            }
        }

        assert_eq!(state.output.contracts.len(), 1);
        let (a, c) = state.output.contracts_iter().next().unwrap();
        assert_eq!(a, "A");
        assert!(c.abi.is_some() && c.evm.is_some());

        let state = state.write_artifacts().unwrap();
        assert_eq!(state.compiled_artifacts.as_ref().len(), 1);

        let out = state.write_cache().unwrap();

        let artifacts: Vec<_> = out.into_artifacts().collect();
        assert_eq!(artifacts.len(), 3);
        for (_, artifact) in artifacts {
            let c = artifact.into_contract_bytecode();
            assert!(c.abi.is_some() && c.bytecode.is_some() && c.deployed_bytecode.is_some());
        }

        tmp.artifacts_snapshot().unwrap().assert_artifacts_essentials_present();
    }

    #[test]
    #[ignore]
    fn can_compile_real_project() {
        init_tracing();
        let paths = ProjectPathsConfig::builder()
            .root("../../foundry-integration-tests/testdata/solmate")
            .build()
            .unwrap();
        let project = Project::builder().paths(paths).build().unwrap();
        let compiler = ProjectCompiler::new(&project).unwrap();
        let _out = compiler.compile().unwrap();
    }
}