foundry_compilers/compile/project.rs

1//! Manages compiling of a `Project`
2//!
3//! The compilation of a project is performed in several steps.
4//!
5//! First the project's dependency graph [`crate::Graph`] is constructed and all imported
6//! dependencies are resolved. The graph holds all the relationships between the files and their
7//! versions. From there the appropriate version sets are derived, i.e. which source files need
8//! to be compiled with which
9//! [`crate::compilers::solc::Solc`] versions.
10//!
11//! At this point we check if we need to compile a source file or whether we can reuse an _existing_
12//! `Artifact`. We don't need to compile a file if:
13//!     - caching is enabled
14//!     - the file is **not** dirty
15//!     - the artifact for that file exists
16//!
17//! This concludes the preprocessing, and we now have one of the following:
18//!    - only `Source` files that need to be compiled
19//!    - only cached `Artifacts`, compilation can be skipped. This is considered an unchanged,
20//!      cached project
21//!    - a mix of both `Source` and `Artifacts`; only the `Source` files need to be compiled, the
22//!      `Artifacts` can be reused.
23//!
24//! The final step is invoking `Solc` via the standard JSON format.
25//!
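//! The whole pipeline is driven by [`ProjectCompiler`]. A minimal sketch of the flow described
//! above, assuming a default multi-compiler project (the path is illustrative):
//!
//! ```ignore
//! use foundry_compilers::{Project, ProjectPathsConfig};
//!
//! // Configure the project layout and build the project.
//! let paths = ProjectPathsConfig::dapptools("./my-project")?;
//! let project = Project::builder().paths(paths).build(Default::default())?;
//!
//! // `compile()` constructs the dependency graph, reuses cached artifacts for clean files and
//! // only invokes the compiler for dirty sources.
//! let output = project.compile()?;
//! assert!(!output.has_compiler_errors());
//! ```
//!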
26//! ### Notes on [Import Path Resolution](https://docs.soliditylang.org/en/develop/path-resolution.html#path-resolution)
27//!
28//! In order to be able to support reproducible builds on all platforms, the Solidity compiler has
29//! to abstract away the details of the filesystem where source files are stored. Paths used in
30//! imports must work the same way everywhere while the command-line interface must be able to work
31//! with platform-specific paths to provide good user experience. This section aims to explain in
32//! detail how Solidity reconciles these requirements.
33//!
34//! The compiler maintains an internal database (virtual filesystem or VFS for short) where each
35//! source unit is assigned a unique source unit name which is an opaque and unstructured
36//! identifier. When you use the import statement, you specify an import path that references a
37//! source unit name. If the compiler does not find any source unit name matching the import path in
38//! the VFS, it invokes the callback, which is responsible for obtaining the source code to be
39//! placed under that name.
40//!
41//! This becomes relevant when dealing with resolved imports, as the following sections show.
42//!
43//! #### Relative Imports
44//!
45//! ```solidity
46//! import "./math/math.sol";
47//! import "contracts/tokens/token.sol";
48//! ```
49//! Assuming the importing file's source unit name is `contracts/contract.sol`, `./math/math.sol`
50//! and `contracts/tokens/token.sol` above are import paths whose source unit names resolve to
51//! `contracts/math/math.sol` and `contracts/tokens/token.sol` respectively.
52//!
53//! #### Direct Imports
54//!
55//! An import that does not start with `./` or `../` is a direct import.
56//!
57//! ```solidity
58//! import "/project/lib/util.sol";         // source unit name: /project/lib/util.sol
59//! import "lib/util.sol";                  // source unit name: lib/util.sol
60//! import "@openzeppelin/address.sol";     // source unit name: @openzeppelin/address.sol
61//! import "https://example.com/token.sol"; // source unit name: <https://example.com/token.sol>
62//! ```
63//!
64//! After applying any import remappings the import path simply becomes the source unit name.
65//!
66//! ##### Import Remapping
67//!
68//! ```solidity
69//! import "github.com/ethereum/dapp-bin/library/math.sol"; // source unit name: dapp-bin/library/math.sol
70//! ```
71//!
72//! If compiled with `solc github.com/ethereum/dapp-bin/=dapp-bin/`, the compiler will look for
73//! the file in the VFS under `dapp-bin/library/math.sol`. If the file is not available there, the
74//! source unit name will be passed to the host filesystem loader, which will then look in
75//! `/project/dapp-bin/library/math.sol`.
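//!
//! Remappings can also be constructed programmatically; the project's configured remappings are
//! later handed to the compiler input via `with_remappings` (see `CompilerSources::compile`
//! below). A minimal sketch, assuming the `Remapping` re-export under
//! `foundry_compilers::artifacts::remappings`:
//!
//! ```ignore
//! use foundry_compilers::artifacts::remappings::Remapping;
//!
//! // "<import path prefix>=<replacement prefix>"
//! let remapping: Remapping = "github.com/ethereum/dapp-bin/=dapp-bin/".parse()?;
//! assert_eq!(remapping.name, "github.com/ethereum/dapp-bin/");
//! assert_eq!(remapping.path, "dapp-bin/");
//! ```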
76//!
77//!
78//! ### Caching and Change detection
79//!
80//! If caching is enabled in the [Project] a cache file will be created upon a successful solc
81//! build. The [cache file](crate::cache::CompilerCache) stores metadata for all the files that were
82//! provided to solc.
83//! For every file the cache file contains a dedicated [cache entry](crate::cache::CacheEntry),
84//! which represents the state of the file. A Solidity file can contain several contracts; for every
85//! contract a separate [artifact](crate::Artifact) is emitted. Therefore the entry also tracks all
86//! artifacts emitted by a file. A Solidity file can also be compiled with several solc versions.
87//!
88//! For example, given `A(<=0.8.10) imports C(>0.4.0)` and
89//! `B(0.8.11) imports C(>0.4.0)`, both `A` and `B` import `C`, but there is no single solc version
90//! compatible with both `A` and `B`; in that case two sets are compiled: [`A`, `C`] and [`B`, `C`].
91//! This is reflected in the cache entry which tracks the file's artifacts by version.
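//!
//! A minimal sketch of how such version sets come about, using the `semver` crate directly (this
//! is illustrative and not the exact selection algorithm implemented by [`crate::Graph`]):
//!
//! ```ignore
//! use semver::{Version, VersionReq};
//!
//! let installed = [Version::new(0, 8, 10), Version::new(0, 8, 11)];
//!
//! // A(<=0.8.10) and C(>0.4.0) can share 0.8.10 ...
//! let a = VersionReq::parse("<=0.8.10").unwrap();
//! let c = VersionReq::parse(">0.4.0").unwrap();
//! assert!(a.matches(&installed[0]) && c.matches(&installed[0]));
//!
//! // ... but B(=0.8.11) cannot, so [B, C] forms a second set compiled with 0.8.11.
//! let b = VersionReq::parse("=0.8.11").unwrap();
//! assert!(!b.matches(&installed[0]) && b.matches(&installed[1]) && c.matches(&installed[1]));
//! ```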
92//!
93//! The cache makes it possible to detect changes during recompilation, so that only the changed
94//! (dirty) files need to be passed to solc. A file is considered dirty if:
95//!   - the file is new, i.e. not included in the existing cache
96//!   - the file was modified since the last compiler run, detected by comparing content hashes
97//!   - any of its imported files is dirty
98//!   - the file's artifacts don't exist or were deleted.
99//!
100//! Recompiling a project with cache enabled detects all files that meet these criteria and provides
101//! solc with only these dirty files instead of the entire source set.
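//!
//! Dirtiness is transitive over imports. A minimal sketch of that propagation (illustrative only;
//! the actual logic lives in the crate's `ArtifactsCache`):
//!
//! ```ignore
//! use std::collections::{BTreeMap, HashSet};
//! use std::path::{Path, PathBuf};
//!
//! /// Returns every file that must be recompiled: files whose content hash changed plus all
//! /// files that (transitively) import a dirty file.
//! fn dirty_closure(
//!     imports: &BTreeMap<PathBuf, Vec<PathBuf>>,
//!     hash_changed: impl Fn(&Path) -> bool,
//! ) -> HashSet<PathBuf> {
//!     // seed with files whose own content changed
//!     let mut dirty: HashSet<PathBuf> =
//!         imports.keys().filter(|file| hash_changed(file)).cloned().collect();
//!     // propagate dirtiness from imports to importers until a fixed point is reached
//!     loop {
//!         let before = dirty.len();
//!         for (file, deps) in imports {
//!             if !dirty.contains(file) && deps.iter().any(|dep| dirty.contains(dep)) {
//!                 dirty.insert(file.clone());
//!             }
//!         }
//!         if dirty.len() == before {
//!             break dirty;
//!         }
//!     }
//! }
//! ```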
102
103use crate::{
104    artifact_output::Artifacts,
105    buildinfo::RawBuildInfo,
106    cache::ArtifactsCache,
107    compilers::{Compiler, CompilerInput, CompilerOutput, Language},
108    filter::SparseOutputFilter,
109    output::{AggregatedCompilerOutput, Builds},
110    report,
111    resolver::{GraphEdges, ResolvedSources},
112    ArtifactOutput, CompilerSettings, Graph, Project, ProjectCompileOutput, ProjectPathsConfig,
113    Sources,
114};
115use foundry_compilers_core::error::Result;
116use rayon::prelude::*;
117use semver::Version;
118use std::{
119    collections::{HashMap, HashSet},
120    fmt::Debug,
121    path::PathBuf,
122    time::Instant,
123};
124
125/// A set of different Solc installations with their version and the sources to be compiled
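/// (shape: `language -> [(compiler version, sources for that version, (profile name, settings))]`)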
126pub(crate) type VersionedSources<'a, L, S> = HashMap<L, Vec<(Version, Sources, (&'a str, &'a S))>>;
127
128/// Invoked before the actual compiler invocation and can override the input.
129///
130/// Updates the list of identified cached mocks (if any) to be stored in cache and updates the
131/// compiler input.
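///
/// A minimal sketch of an implementation that leaves the input untouched (a hypothetical
/// `NoopPreprocessor`, shown only to illustrate the trait's shape):
///
/// ```ignore
/// use std::{collections::HashSet, path::PathBuf};
///
/// #[derive(Debug)]
/// struct NoopPreprocessor;
///
/// impl<C: Compiler> Preprocessor<C> for NoopPreprocessor {
///     fn preprocess(
///         &self,
///         _compiler: &C,
///         _input: &mut C::Input,
///         _paths: &ProjectPathsConfig<C::Language>,
///         _mocks: &mut HashSet<PathBuf>,
///     ) -> Result<()> {
///         // inspect or rewrite the input here; mocks discovered during preprocessing would be
///         // recorded in `mocks` so they end up in the cache
///         Ok(())
///     }
/// }
/// ```
///
/// A preprocessor is registered via [`ProjectCompiler::with_preprocessor`].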
132pub trait Preprocessor<C: Compiler>: Debug {
133    fn preprocess(
134        &self,
135        compiler: &C,
136        input: &mut C::Input,
137        paths: &ProjectPathsConfig<C::Language>,
138        mocks: &mut HashSet<PathBuf>,
139    ) -> Result<()>;
140}
141
142#[derive(Debug)]
143pub struct ProjectCompiler<
144    'a,
145    T: ArtifactOutput<CompilerContract = C::CompilerContract>,
146    C: Compiler,
147> {
148    /// Contains the relationship of the source files and their imports
149    edges: GraphEdges<C::ParsedSource>,
150    project: &'a Project<C, T>,
151    /// A mapping from a source file path to the primary profile name selected for it.
152    primary_profiles: HashMap<PathBuf, &'a str>,
153    /// how to compile all the sources
154    sources: CompilerSources<'a, C::Language, C::Settings>,
155    /// Optional preprocessor
156    preprocessor: Option<Box<dyn Preprocessor<C>>>,
157}
158
159impl<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
160    ProjectCompiler<'a, T, C>
161{
162    /// Create a new `ProjectCompiler` to bootstrap the compilation process of the project's
163    /// sources.
164    pub fn new(project: &'a Project<C, T>) -> Result<Self> {
165        Self::with_sources(project, project.paths.read_input_files()?)
166    }
167
168    /// Bootstraps the compilation process by resolving the dependency graph of all sources and the
169    /// appropriate `Solc` -> `Sources` set as well as the compile mode to use (parallel,
170    /// sequential)
171    ///
172    /// Multiple (`Solc` -> `Sources`) pairs can be compiled in parallel if the `Project` allows
173    /// multiple `jobs`, see [`crate::Project::set_solc_jobs()`].
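    ///
    /// # Examples
    ///
    /// A minimal sketch (error handling elided):
    ///
    /// ```ignore
    /// use foundry_compilers::Project;
    ///
    /// let project = Project::builder().build(Default::default())?;
    /// // read the input files explicitly instead of relying on `ProjectCompiler::new`
    /// let sources = project.paths.read_input_files()?;
    /// let output = ProjectCompiler::with_sources(&project, sources)?.compile()?;
    /// ```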
174    pub fn with_sources(project: &'a Project<C, T>, mut sources: Sources) -> Result<Self> {
175        if let Some(filter) = &project.sparse_output {
176            sources.retain(|f, _| filter.is_match(f))
177        }
178        let graph = Graph::resolve_sources(&project.paths, sources)?;
179        let ResolvedSources { sources, primary_profiles, edges } =
180            graph.into_sources_by_version(project)?;
181
182        // If there are multiple different versions, and we can use multiple jobs we can compile
183        // them in parallel.
184        let jobs_cnt = || sources.values().map(|v| v.len()).sum::<usize>();
185        let sources = CompilerSources {
186            jobs: (project.solc_jobs > 1 && jobs_cnt() > 1).then_some(project.solc_jobs),
187            sources,
188        };
189
190        Ok(Self { edges, primary_profiles, project, sources, preprocessor: None })
191    }
192
193    pub fn with_preprocessor(self, preprocessor: impl Preprocessor<C> + 'static) -> Self {
194        Self { preprocessor: Some(Box::new(preprocessor)), ..self }
195    }
196
197    /// Compiles all the sources of the `Project` in the appropriate mode
198    ///
199    /// If caching is enabled, the sources are filtered and only _dirty_ sources are recompiled.
200    ///
201    /// The output of the compile process can be a mix of reused artifacts and freshly compiled
202    /// `Contract`s
203    ///
204    /// # Examples
205    /// ```no_run
206    /// use foundry_compilers::Project;
207    ///
208    /// let project = Project::builder().build(Default::default())?;
209    /// let output = project.compile()?;
210    /// # Ok::<(), Box<dyn std::error::Error>>(())
211    /// ```
212    pub fn compile(self) -> Result<ProjectCompileOutput<C, T>> {
213        let slash_paths = self.project.slash_paths;
214
215        // drive the compiler state machine to completion
216        let mut output = self.preprocess()?.compile()?.write_artifacts()?.write_cache()?;
217
218        if slash_paths {
219            // ensures we always use `/` paths
220            output.slash_paths();
221        }
222
223        Ok(output)
224    }
225
226    /// Does basic preprocessing
227    ///   - sets proper source unit names
228    ///   - checks the cache
229    fn preprocess(self) -> Result<PreprocessedState<'a, T, C>> {
230        trace!("preprocessing");
231        let Self { edges, project, mut sources, primary_profiles, preprocessor } = self;
232
233        // convert paths on windows to ensure consistency with the `CompilerOutput` `solc` emits,
234        // which is unix style `/`
235        sources.slash_paths();
236
237        let mut cache = ArtifactsCache::new(project, edges, preprocessor.is_some())?;
238        // retain and compile only dirty sources and all their imports
239        sources.filter(&mut cache);
240
241        Ok(PreprocessedState { sources, cache, primary_profiles, preprocessor })
242    }
243}
244
245/// A series of states that comprise the [`ProjectCompiler::compile()`] state machine
246///
247/// The main reason for modeling compilation as a series of states is to be able to inspect and debug each state individually.
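///
/// The states are driven to completion in this order (see [`ProjectCompiler::compile()`]):
///
/// ```ignore
/// let output = compiler.preprocess()?.compile()?.write_artifacts()?.write_cache()?;
/// ```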
248#[derive(Debug)]
249struct PreprocessedState<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
250{
251    /// Contains all the sources to compile.
252    sources: CompilerSources<'a, C::Language, C::Settings>,
253
254    /// Cache that holds `CacheEntry` objects if caching is enabled and the project is recompiled
255    cache: ArtifactsCache<'a, T, C>,
256
257    /// A mapping from a source file path to the primary profile name selected for it.
258    primary_profiles: HashMap<PathBuf, &'a str>,
259
260    /// Optional preprocessor
261    preprocessor: Option<Box<dyn Preprocessor<C>>>,
262}
263
264impl<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
265    PreprocessedState<'a, T, C>
266{
267    /// advance to the next state by compiling all sources
268    fn compile(self) -> Result<CompiledState<'a, T, C>> {
269        trace!("compiling");
270        let PreprocessedState { sources, mut cache, primary_profiles, preprocessor } = self;
271
272        let mut output = sources.compile(&mut cache, preprocessor)?;
273
274        // source paths get stripped before handing them over to solc, so solc never uses absolute
275        // paths, instead `--base-path <root dir>` is set. this way any metadata that's derived from
276        // data (paths) is relative to the project dir and should be independent of the current OS
277        // disk. However internally we still want to keep absolute paths, so we join the
278        // contracts again
279        output.join_all(cache.project().root());
280
281        Ok(CompiledState { output, cache, primary_profiles })
282    }
283}
284
285/// Represents the state after `solc` was successfully invoked
286#[derive(Debug)]
287struct CompiledState<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler> {
288    output: AggregatedCompilerOutput<C>,
289    cache: ArtifactsCache<'a, T, C>,
290    primary_profiles: HashMap<PathBuf, &'a str>,
291}
292
293impl<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
294    CompiledState<'a, T, C>
295{
296    /// advance to the next state by handling all artifacts
297    ///
298    /// Writes all output contracts to disk if enabled in the `Project` and if the build was
299    /// successful
300    #[instrument(skip_all, name = "write-artifacts")]
301    fn write_artifacts(self) -> Result<ArtifactsState<'a, T, C>> {
302        let CompiledState { output, cache, primary_profiles } = self;
303
304        let project = cache.project();
305        let ctx = cache.output_ctx();
306        // write all artifacts via the handler but only if the build succeeded and project wasn't
307        // configured with `no_artifacts == true`
308        let compiled_artifacts = if project.no_artifacts {
309            project.artifacts_handler().output_to_artifacts(
310                &output.contracts,
311                &output.sources,
312                ctx,
313                &project.paths,
314                &primary_profiles,
315            )
316        } else if output.has_error(
317            &project.ignored_error_codes,
318            &project.ignored_file_paths,
319            &project.compiler_severity_filter,
320        ) {
321            trace!("skip writing cache file due to solc errors: {:?}", output.errors);
322            project.artifacts_handler().output_to_artifacts(
323                &output.contracts,
324                &output.sources,
325                ctx,
326                &project.paths,
327                &primary_profiles,
328            )
329        } else {
330            trace!(
331                "handling artifact output for {} contracts and {} sources",
332                output.contracts.len(),
333                output.sources.len()
334            );
335            // this emits the artifacts via the project's artifacts handler
336            let artifacts = project.artifacts_handler().on_output(
337                &output.contracts,
338                &output.sources,
339                &project.paths,
340                ctx,
341                &primary_profiles,
342            )?;
343
344            // emits all the build infos, if they exist
345            output.write_build_infos(project.build_info_path())?;
346
347            artifacts
348        };
349
350        Ok(ArtifactsState { output, cache, compiled_artifacts })
351    }
352}
353
354/// Represents the state after all artifacts were written to disk
355#[derive(Debug)]
356struct ArtifactsState<'a, T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler> {
357    output: AggregatedCompilerOutput<C>,
358    cache: ArtifactsCache<'a, T, C>,
359    compiled_artifacts: Artifacts<T::Artifact>,
360}
361
362impl<T: ArtifactOutput<CompilerContract = C::CompilerContract>, C: Compiler>
363    ArtifactsState<'_, T, C>
364{
365    /// Writes the cache file
366    ///
367    /// This concludes the [`Project::compile()`] state machine.
368    fn write_cache(self) -> Result<ProjectCompileOutput<C, T>> {
369        let ArtifactsState { output, cache, compiled_artifacts } = self;
370        let project = cache.project();
371        let ignored_error_codes = project.ignored_error_codes.clone();
372        let ignored_file_paths = project.ignored_file_paths.clone();
373        let compiler_severity_filter = project.compiler_severity_filter;
374        let has_error =
375            output.has_error(&ignored_error_codes, &ignored_file_paths, &compiler_severity_filter);
376        let skip_write_to_disk = project.no_artifacts || has_error;
377        trace!(has_error, project.no_artifacts, skip_write_to_disk, cache_path=?project.cache_path(),"prepare writing cache file");
378
379        let (cached_artifacts, cached_builds) =
380            cache.consume(&compiled_artifacts, &output.build_infos, !skip_write_to_disk)?;
381
382        project.artifacts_handler().handle_cached_artifacts(&cached_artifacts)?;
383
384        let builds = Builds(
385            output
386                .build_infos
387                .iter()
388                .map(|build_info| (build_info.id.clone(), build_info.build_context.clone()))
389                .chain(cached_builds)
390                .map(|(id, context)| (id, context.with_joined_paths(project.paths.root.as_path())))
391                .collect(),
392        );
393
394        Ok(ProjectCompileOutput {
395            compiler_output: output,
396            compiled_artifacts,
397            cached_artifacts,
398            ignored_error_codes,
399            ignored_file_paths,
400            compiler_severity_filter,
401            builds,
402        })
403    }
404}
405
406/// Determines how the `solc <-> sources` pairs are executed.
407#[derive(Debug, Clone)]
408struct CompilerSources<'a, L, S> {
409    /// The sources to compile.
410    sources: VersionedSources<'a, L, S>,
411    /// The number of jobs to use for parallel compilation.
412    jobs: Option<usize>,
413}
414
415impl<L: Language, S: CompilerSettings> CompilerSources<'_, L, S> {
416    /// Converts all `\\` separators to `/`.
417    ///
418    /// This effectively ensures that `solc` can find imported files like `/src/Cheats.sol` in the
419    /// VFS (the `CompilerInput` as json) under `src/Cheats.sol`.
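    ///
    /// A minimal sketch of the per-path conversion, using the `path_slash` crate as in the
    /// Windows-only implementation below:
    ///
    /// ```ignore
    /// use path_slash::PathBufExt;
    /// use std::path::PathBuf;
    ///
    /// let windows_style = PathBuf::from(r"src\Cheats.sol");
    /// assert_eq!(windows_style.to_slash_lossy(), "src/Cheats.sol");
    /// ```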
420    fn slash_paths(&mut self) {
421        #[cfg(windows)]
422        {
423            use path_slash::PathBufExt;
424
425            self.sources.values_mut().for_each(|versioned_sources| {
426                versioned_sources.iter_mut().for_each(|(_, sources, _)| {
427                    *sources = std::mem::take(sources)
428                        .into_iter()
429                        .map(|(path, source)| {
430                            (PathBuf::from(path.to_slash_lossy().as_ref()), source)
431                        })
432                        .collect()
433                })
434            });
435        }
436    }
437
438    /// Filters out all sources that don't need to be compiled, see [`ArtifactsCache::filter`]
439    fn filter<
440        T: ArtifactOutput<CompilerContract = C::CompilerContract>,
441        C: Compiler<Language = L>,
442    >(
443        &mut self,
444        cache: &mut ArtifactsCache<'_, T, C>,
445    ) {
446        cache.remove_dirty_sources();
447        for versioned_sources in self.sources.values_mut() {
448            for (version, sources, (profile, _)) in versioned_sources {
449                trace!("Filtering {} sources for {}", sources.len(), version);
450                cache.filter(sources, version, profile);
451                trace!(
452                    "Detected {} sources to compile {:?}",
453                    sources.dirty().count(),
454                    sources.dirty_files().collect::<Vec<_>>()
455                );
456            }
457        }
458    }
459
460    /// Compiles all the files with `Solc`
461    fn compile<
462        C: Compiler<Language = L, Settings = S>,
463        T: ArtifactOutput<CompilerContract = C::CompilerContract>,
464    >(
465        self,
466        cache: &mut ArtifactsCache<'_, T, C>,
467        preprocessor: Option<Box<dyn Preprocessor<C>>>,
468    ) -> Result<AggregatedCompilerOutput<C>> {
469        let project = cache.project();
470        let graph = cache.graph();
471
472        let jobs_cnt = self.jobs;
473
474        let sparse_output = SparseOutputFilter::new(project.sparse_output.as_deref());
475
476        // Include additional paths collected during graph resolution.
477        let mut include_paths = project.paths.include_paths.clone();
478        include_paths.extend(graph.include_paths().clone());
479
480        // Get current list of mocks from cache. This will be passed to preprocessors and updated
481        // accordingly, then set back in cache.
482        let mut mocks = cache.mocks();
483
484        let mut jobs = Vec::new();
485        for (language, versioned_sources) in self.sources {
486            for (version, sources, (profile, opt_settings)) in versioned_sources {
487                let mut opt_settings = opt_settings.clone();
488                if sources.is_empty() {
489                    // nothing to compile
490                    trace!("skip {} for empty sources set", version);
491                    continue;
492                }
493
494                // depending on the composition of the filtered sources, the output selection can be
495                // optimized
496                let actually_dirty =
497                    sparse_output.sparse_sources(&sources, &mut opt_settings, graph);
498
499                if actually_dirty.is_empty() {
500                    // nothing to compile for this particular language, all dirty files are in the
501                    // other language set
502                    trace!("skip {} run due to empty source set", version);
503                    continue;
504                }
505
506                trace!("calling {} with {} sources {:?}", version, sources.len(), sources.keys());
507
508                let settings = opt_settings
509                    .with_base_path(&project.paths.root)
510                    .with_allow_paths(&project.paths.allowed_paths)
511                    .with_include_paths(&include_paths)
512                    .with_remappings(&project.paths.remappings);
513
514                let mut input = C::Input::build(sources, settings, language, version.clone());
515
516                input.strip_prefix(project.paths.root.as_path());
517
518                if let Some(preprocessor) = preprocessor.as_ref() {
519                    preprocessor.preprocess(
520                        &project.compiler,
521                        &mut input,
522                        &project.paths,
523                        &mut mocks,
524                    )?;
525                }
526
527                jobs.push((input, profile, actually_dirty));
528            }
529        }
530
531        // Update cache with mocks updated by preprocessors.
532        cache.update_mocks(mocks);
533
534        let results = if let Some(num_jobs) = jobs_cnt {
535            compile_parallel(&project.compiler, jobs, num_jobs)
536        } else {
537            compile_sequential(&project.compiler, jobs)
538        }?;
539
540        let mut aggregated = AggregatedCompilerOutput::default();
541
542        for (input, mut output, profile, actually_dirty) in results {
543            let version = input.version();
544
545            // Mark all files as seen by the compiler
546            for file in &actually_dirty {
547                cache.compiler_seen(file);
548            }
549
550            let build_info = RawBuildInfo::new(&input, &output, project.build_info)?;
551
552            output.retain_files(
553                actually_dirty
554                    .iter()
555                    .map(|f| f.strip_prefix(project.paths.root.as_path()).unwrap_or(f)),
556            );
557            output.join_all(project.paths.root.as_path());
558
559            aggregated.extend(version.clone(), build_info, profile, output);
560        }
561
562        Ok(aggregated)
563    }
564}
565
566type CompilationResult<'a, I, E, C> = Result<Vec<(I, CompilerOutput<E, C>, &'a str, Vec<PathBuf>)>>;
567
568/// Compiles the input set sequentially and returns a [Vec] of outputs.
569fn compile_sequential<'a, C: Compiler>(
570    compiler: &C,
571    jobs: Vec<(C::Input, &'a str, Vec<PathBuf>)>,
572) -> CompilationResult<'a, C::Input, C::CompilationError, C::CompilerContract> {
573    jobs.into_iter()
574        .map(|(input, profile, actually_dirty)| {
575            let start = Instant::now();
576            report::compiler_spawn(
577                &input.compiler_name(),
578                input.version(),
579                actually_dirty.as_slice(),
580            );
581            let output = compiler.compile(&input)?;
582            report::compiler_success(&input.compiler_name(), input.version(), &start.elapsed());
583
584            Ok((input, output, profile, actually_dirty))
585        })
586        .collect()
587}
588
589/// Compiles the input set in parallel using `num_jobs` threads.
590fn compile_parallel<'a, C: Compiler>(
591    compiler: &C,
592    jobs: Vec<(C::Input, &'a str, Vec<PathBuf>)>,
593    num_jobs: usize,
594) -> CompilationResult<'a, C::Input, C::CompilationError, C::CompilerContract> {
595    // need to get the currently installed reporter before installing the pool, otherwise each new
596    // thread in the pool will get initialized with the default value of the `thread_local!`'s
597    // localkey. This way we keep access to the reporter in the rayon pool
598    let scoped_report = report::get_default(|reporter| reporter.clone());
599
600    // start a rayon threadpool that will execute all `Solc::compile()` processes
601    let pool = rayon::ThreadPoolBuilder::new().num_threads(num_jobs).build().unwrap();
602
603    pool.install(move || {
604        jobs.into_par_iter()
605            .map(move |(input, profile, actually_dirty)| {
606                // set the reporter on this thread
607                let _guard = report::set_scoped(&scoped_report);
608
609                let start = Instant::now();
610                report::compiler_spawn(
611                    &input.compiler_name(),
612                    input.version(),
613                    actually_dirty.as_slice(),
614                );
615                compiler.compile(&input).map(move |output| {
616                    report::compiler_success(
617                        &input.compiler_name(),
618                        input.version(),
619                        &start.elapsed(),
620                    );
621                    (input, output, profile, actually_dirty)
622                })
623            })
624            .collect()
625    })
626}
627
628#[cfg(test)]
629#[cfg(all(feature = "project-util", feature = "svm-solc"))]
630mod tests {
631    use std::path::Path;
632
633    use foundry_compilers_artifacts::output_selection::ContractOutputSelection;
634
635    use crate::{
636        compilers::multi::MultiCompiler, project_util::TempProject, ConfigurableArtifacts,
637        MinimalCombinedArtifacts, ProjectPathsConfig,
638    };
639
640    use super::*;
641
642    fn init_tracing() {
643        let _ = tracing_subscriber::fmt()
644            .with_env_filter(tracing_subscriber::EnvFilter::from_default_env())
645            .try_init()
646            .ok();
647    }
648
649    #[test]
650    fn can_preprocess() {
651        let root = Path::new(env!("CARGO_MANIFEST_DIR")).join("../../test-data/dapp-sample");
652        let project = Project::builder()
653            .paths(ProjectPathsConfig::dapptools(&root).unwrap())
654            .build(Default::default())
655            .unwrap();
656
657        let compiler = ProjectCompiler::new(&project).unwrap();
658        let prep = compiler.preprocess().unwrap();
659        let cache = prep.cache.as_cached().unwrap();
660        // ensure that we have exactly 3 empty entries which will be filled on compilation.
661        assert_eq!(cache.cache.files.len(), 3);
662        assert!(cache.cache.files.values().all(|v| v.artifacts.is_empty()));
663
664        let compiled = prep.compile().unwrap();
665        assert_eq!(compiled.output.contracts.files().count(), 3);
666    }
667
668    #[test]
669    fn can_detect_cached_files() {
670        let root = Path::new(env!("CARGO_MANIFEST_DIR")).join("../../test-data/dapp-sample");
671        let paths = ProjectPathsConfig::builder().sources(root.join("src")).lib(root.join("lib"));
672        let project = TempProject::<MultiCompiler, MinimalCombinedArtifacts>::new(paths).unwrap();
673
674        let compiled = project.compile().unwrap();
675        compiled.assert_success();
676
677        let inner = project.project();
678        let compiler = ProjectCompiler::new(inner).unwrap();
679        let prep = compiler.preprocess().unwrap();
680        assert!(prep.cache.as_cached().unwrap().dirty_sources.is_empty())
681    }
682
683    #[test]
684    fn can_recompile_with_optimized_output() {
685        let tmp = TempProject::<MultiCompiler, ConfigurableArtifacts>::dapptools().unwrap();
686
687        tmp.add_source(
688            "A",
689            r#"
690    pragma solidity ^0.8.10;
691    import "./B.sol";
692    contract A {}
693   "#,
694        )
695        .unwrap();
696
697        tmp.add_source(
698            "B",
699            r#"
700    pragma solidity ^0.8.10;
701    contract B {
702        function hello() public {}
703    }
704    import "./C.sol";
705   "#,
706        )
707        .unwrap();
708
709        tmp.add_source(
710            "C",
711            r"
712    pragma solidity ^0.8.10;
713    contract C {
714            function hello() public {}
715    }
716   ",
717        )
718        .unwrap();
719        let compiled = tmp.compile().unwrap();
720        compiled.assert_success();
721
722        tmp.artifacts_snapshot().unwrap().assert_artifacts_essentials_present();
723
724        // modify A.sol
725        tmp.add_source(
726            "A",
727            r#"
728    pragma solidity ^0.8.10;
729    import "./B.sol";
730    contract A {
731        function testExample() public {}
732    }
733   "#,
734        )
735        .unwrap();
736
737        let compiler = ProjectCompiler::new(tmp.project()).unwrap();
738        let state = compiler.preprocess().unwrap();
739        let sources = &state.sources.sources;
740
741        let cache = state.cache.as_cached().unwrap();
742
743        // 2 clean sources
744        assert_eq!(cache.cache.artifacts_len(), 2);
745        assert!(cache.cache.all_artifacts_exist());
746        assert_eq!(cache.dirty_sources.len(), 1);
747
748        let len = sources.values().map(|v| v.len()).sum::<usize>();
749        // single solc
750        assert_eq!(len, 1);
751
752        let filtered = &sources.values().next().unwrap()[0].1;
753
754        // 3 contracts total
755        assert_eq!(filtered.0.len(), 3);
756        // A is modified
757        assert_eq!(filtered.dirty().count(), 1);
758        assert!(filtered.dirty_files().next().unwrap().ends_with("A.sol"));
759
760        let state = state.compile().unwrap();
761        assert_eq!(state.output.sources.len(), 1);
762        for (f, source) in state.output.sources.sources() {
763            if f.ends_with("A.sol") {
764                assert!(source.ast.is_some());
765            } else {
766                assert!(source.ast.is_none());
767            }
768        }
769
770        assert_eq!(state.output.contracts.len(), 1);
771        let (a, c) = state.output.contracts_iter().next().unwrap();
772        assert_eq!(a, "A");
773        assert!(c.abi.is_some() && c.evm.is_some());
774
775        let state = state.write_artifacts().unwrap();
776        assert_eq!(state.compiled_artifacts.as_ref().len(), 1);
777
778        let out = state.write_cache().unwrap();
779
780        let artifacts: Vec<_> = out.into_artifacts().collect();
781        assert_eq!(artifacts.len(), 3);
782        for (_, artifact) in artifacts {
783            let c = artifact.into_contract_bytecode();
784            assert!(c.abi.is_some() && c.bytecode.is_some() && c.deployed_bytecode.is_some());
785        }
786
787        tmp.artifacts_snapshot().unwrap().assert_artifacts_essentials_present();
788    }
789
790    #[test]
791    #[ignore]
792    fn can_compile_real_project() {
793        init_tracing();
794        let paths = ProjectPathsConfig::builder()
795            .root("../../foundry-integration-tests/testdata/solmate")
796            .build()
797            .unwrap();
798        let project = Project::builder().paths(paths).build(Default::default()).unwrap();
799        let compiler = ProjectCompiler::new(&project).unwrap();
800        let _out = compiler.compile().unwrap();
801    }
802
803    #[test]
804    fn extra_output_cached() {
805        let root = Path::new(env!("CARGO_MANIFEST_DIR")).join("../../test-data/dapp-sample");
806        let paths = ProjectPathsConfig::builder().sources(root.join("src")).lib(root.join("lib"));
807        let mut project = TempProject::<MultiCompiler>::new(paths).unwrap();
808
809        // Compile once without enabled extra output
810        project.compile().unwrap();
811
812        // Enable extra output of abi
813        project.project_mut().artifacts =
814            ConfigurableArtifacts::new([], [ContractOutputSelection::Abi]);
815
816        // Ensure that abi appears after compilation and that we didn't recompile anything
817        let abi_path = project.project().paths.artifacts.join("Dapp.sol/Dapp.abi.json");
818        assert!(!abi_path.exists());
819        let output = project.compile().unwrap();
820        assert!(output.compiler_output.is_empty());
821        assert!(abi_path.exists());
822    }
823
824    #[test]
825    fn can_compile_leftovers_after_sparse() {
826        let mut tmp = TempProject::<MultiCompiler, ConfigurableArtifacts>::dapptools().unwrap();
827
828        tmp.add_source(
829            "A",
830            r#"
831pragma solidity ^0.8.10;
832import "./B.sol";
833contract A {}
834"#,
835        )
836        .unwrap();
837
838        tmp.add_source(
839            "B",
840            r#"
841pragma solidity ^0.8.10;
842contract B {}
843"#,
844        )
845        .unwrap();
846
847        tmp.project_mut().sparse_output = Some(Box::new(|f: &Path| f.ends_with("A.sol")));
848        let compiled = tmp.compile().unwrap();
849        compiled.assert_success();
850        assert_eq!(compiled.artifacts().count(), 1);
851
852        tmp.project_mut().sparse_output = None;
853        let compiled = tmp.compile().unwrap();
854        compiled.assert_success();
855        assert_eq!(compiled.artifacts().count(), 2);
856    }
857}