Last week I gave a talk outlining how the Rust compiler works internally. Though the slides might be helpful & there was some interesting discussion after the talk, I thought a more detailed post might do this topic more justice.

Note that as of writing this the current release of Rust is 0.10. Some of what I cover in this article may well change by the time you get around to reading this.

Also note that although I’ve made some minor contributions to Rust, I don’t pretend to know everything there is to know about the compiler or the language. There’s a lot going on here!

Compilers 101

Before we talk about Rust specifically, let’s cover some compiler fundamentals. If you’re already familiar with this stuff, feel free to skip ahead to the Rust-specific stuff.

Internally, most modern compilers tend to follow a fairly predictable pattern:

  1. The scanner reads raw program source code and emits a stream of tokens (there’s a toy sketch of this step just after this list).
  2. A parser takes the stream of tokens from the scanner and constructs an intermediate representation of the program by applying grammar rules specific to the language being parsed. The resulting intermediate representation is usually an abstract syntax tree (aka AST), an in-memory data structure representing the parsed program.
  3. At this point semantic analysis occurs, where the compiler traverses the AST to perform various language-specific safety & sanity checks on the parsed program. Statically typed languages typically do their type checking at this point too. Certain optimizations may also be applied to the AST before or after the semantic analysis phase.
  4. The AST itself is then traversed by a code generator to emit the target language. The target language might be native code, or it might be some other language (for example, you might be targeting JavaScript). In the case of languages like C and C++, the intermediate object files emitted during the code generation phase may then be linked together to produce a native binary.
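
Just to make step 1 concrete, here’s a toy scanner that turns a line of source text into a flat stream of tokens. Everything in it (the Token enum, the scan function) is invented for this post and written in present-day Rust; it bears no relation to the real Rust compiler’s code.

```rust
// A toy scanner: raw source text in, a Vec of tokens out.
#[derive(Debug, PartialEq)]
enum Token {
    Ident(String),
    Number(u32),
    Plus,
    Equals,
}

fn scan(src: &str) -> Vec<Token> {
    let mut tokens = Vec::new();
    let mut chars = src.chars().peekable();
    while let Some(&c) = chars.peek() {
        if c.is_whitespace() {
            chars.next(); // skip whitespace between tokens
        } else if c.is_ascii_digit() {
            let mut n = 0;
            while let Some(d) = chars.peek().and_then(|c| c.to_digit(10)) {
                n = n * 10 + d;
                chars.next();
            }
            tokens.push(Token::Number(n));
        } else if c.is_alphabetic() {
            let mut ident = String::new();
            while chars.peek().map_or(false, |c| c.is_alphanumeric()) {
                ident.push(chars.next().unwrap());
            }
            tokens.push(Token::Ident(ident));
        } else {
            chars.next();
            tokens.push(match c {
                '+' => Token::Plus,
                '=' => Token::Equals,
                other => panic!("unexpected character {:?}", other),
            });
        }
    }
    tokens
}

fn main() {
    println!("{:?}", scan("x = 40 + 2"));
    // [Ident("x"), Equals, Number(40), Plus, Number(2)]
}
```

A parser (step 2) would then consume that token stream and build an AST; there’s an equally tiny sketch of that in the phase_1_parse_input section below.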

You can find these components in most popular programming languages including Python, Ruby and PHP, though the latter omits the syntax tree & instead blurs the line between parsing & code generation, last I checked. I’ll leave it as an exercise to the reader to track these down – let’s move on to the Rust compiler. :)

An overview of the Rust Compiler

At a high level, there are no surprises in the structure of the Rust compiler. In Rust, you will find:

  • A scanner
  • A parser
  • A semantic analysis phase
  • A code generation phase
  • A link step

In fact, these steps are pretty easy to identify with a little digging if you track down compile_input in src/librustc/driver/driver.rs. compile_input is called from main_args in src/librustc/lib.rs, which is called directly from the rustc entry point in the same file. It’s the function that really drives the compilation process.

Anyway, within compile_input you’ll find calls to the following functions, each of which gets a section below (plus the final link step, covered at the end):

  • phase_1_parse_input
  • phase_2_configure_and_expand
  • phase_3_run_analysis_passes
  • phase_4_translate_to_llvm
  • phase_5_run_llvm_passes

If you peruse these functions a little, you’ll quickly get a feel for how everything fits together. Let’s look at each phase in a little more detail. I’ll focus as much as I can on the parts that make the Rust compiler different from the compilers for other languages that I’m familiar with.

phase_1_parse_input

First, you’ll usually hit syntax::parse::parse_crate_from_file in src/libsyntax/parse/mod.rs. Stepping through a few levels of indirection you’ll come to syntax::parse::file_to_filemap, which slurps the source file from disk and then calls syntax::parse::string_to_filemap to construct a FileMap, which seems to be a glorified container for the file name and source code (plus some extra metadata & helper methods).

The FileMap is transformed into an ast::TokenTree via a quick pass through a lexer built from the source string, which in turn is read by syntax::parse::parser::Parser.parse_all_token_trees(). All of this happens in syntax::parse::filemap_to_tts. Note that although we’re making a pass through the parser, we’re not actually building the AST at this point – i.e. the TokenTree is not the AST, though it does appear to be an intermediate representation between raw source code and the AST. (You’ll have to forgive the hand-waving here, I’m not especially familiar with this part of the code!)

Before we look at the AST pass, it’s interesting to note that the pass to construct the TokenTree uses an impl of syntax::lexer::Reader called syntax::lexer::StringReader, which emits tokens from a raw string. This is a “traditional” scanner that converts raw source code into tokens. The TokenTree is then used to create a new lexer in syntax::parse::tts_to_parser that emits tokens built from the TokenTree, and that lexer is what the AST pass reads from. (There’s a TokenTree-based Reader impl in libsyntax alongside the string-based one.) I find this a little “weird” in that compilers for simpler languages usually go straight from raw source code to tokens to an AST, but from what I can tell this probably helps drive Rust’s syntax extension / macro system.
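
If the idea of a token tree is unfamiliar: it’s a token stream in which everything between a matching pair of delimiters gets grouped into a single nested node. Here’s a rough sketch of that shape using made-up types in present-day Rust, not the real libsyntax definitions:

```rust
// Made-up types sketching what a token *tree* adds over a flat token stream:
// delimited groups are kept together as single nested nodes, which is handy
// when handing unparsed chunks of tokens to macros.
#[allow(dead_code)]
#[derive(Debug)]
enum Delimiter {
    Paren,   // ( ... )
    Bracket, // [ ... ]
    Brace,   // { ... }
}

#[derive(Debug)]
enum TokenTree {
    Token(String),                        // a single token, e.g. `foo` or `1`
    Delimited(Delimiter, Vec<TokenTree>), // everything inside a delimiter pair
}

fn main() {
    // Roughly how `foo(1, [2])` might be grouped.
    let tt = vec![
        TokenTree::Token("foo".to_string()),
        TokenTree::Delimited(
            Delimiter::Paren,
            vec![
                TokenTree::Token("1".to_string()),
                TokenTree::Token(",".to_string()),
                TokenTree::Delimited(
                    Delimiter::Bracket,
                    vec![TokenTree::Token("2".to_string())],
                ),
            ],
        ),
    ];
    println!("{:?}", tt);
}
```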

Anyway, TokenTree weirdness aside, by the end of all the indirection in parse_crate_from_file we’ve got an interface to a fairly traditional-looking scanner in the form of a syntax::lexer::Reader wrapped in a fairly boring syntax::parse::parser::Parser. We call syntax::parse::parser::Parser.parse_crate_mod to parse the token stream from the lexer::Reader and return our “real” AST in the form of a syntax::ast::Crate via what appears to be a fairly standard recursive descent parse.
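
If you haven’t seen a recursive descent parser before, here’s one in miniature. It’s a made-up toy in present-day Rust rather than anything from libsyntax, and it treats single characters as tokens to keep it short, but the shape is the same one you’ll find (at a much larger scale) in parse_crate_mod and friends: each grammar rule becomes a function that consumes input and returns an AST node.

```rust
// A toy recursive descent parser for expressions like "1+23+4".
#[derive(Debug)]
enum Expr {
    Num(u32),
    Add(Box<Expr>, Box<Expr>),
}

struct Parser<'a> {
    chars: std::iter::Peekable<std::str::Chars<'a>>,
}

impl<'a> Parser<'a> {
    fn new(src: &'a str) -> Parser<'a> {
        Parser { chars: src.chars().peekable() }
    }

    // expr := num ('+' num)*
    fn parse_expr(&mut self) -> Expr {
        let mut lhs = self.parse_num();
        while self.chars.peek() == Some(&'+') {
            self.chars.next(); // consume the '+'
            let rhs = self.parse_num();
            lhs = Expr::Add(Box::new(lhs), Box::new(rhs));
        }
        lhs
    }

    // num := digit+
    fn parse_num(&mut self) -> Expr {
        let mut n = 0;
        while let Some(d) = self.chars.peek().and_then(|c| c.to_digit(10)) {
            n = n * 10 + d;
            self.chars.next();
        }
        Expr::Num(n)
    }
}

fn main() {
    let ast = Parser::new("1+23+4").parse_expr();
    println!("{:?}", ast); // Add(Add(Num(1), Num(23)), Num(4))
}
```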

phase_2_configure_and_expand

From a quick pass through this method:

  1. The crate’s feature attributes are checked. This is where we check the crate attributes for #[feature(…)] declarations specifying gated features required by the crate being compiled. If a requested feature has been removed, the compiler will complain. A similar error will appear if an unknown feature is requested, or if the feature name is malformed.

  2. Standard libraries are injected, unless #[no_std] is specified. Typically, to link to external crates you need to refer to the crate using the extern keyword, e.g. extern crate pcre;

    Unlike most crates, the std crate will be available to you by default without such an extern declaration. If you specify #[no_std] in your crate attributes, the std crate will be excluded. Similar voodoo is performed on #[no_start] as part of this pass.

  3. There is a pass to remove “unconfigured” items. This is basically applying the special rules implied by conditional compilation requested using #[cfg(foo)]. Note that we haven’t yet expanded macros / syntax extensions.

  4. Syntax extension expansion occurs. Exactly as it sounds, but if you’re unfamiliar with Rust’s facilities for syntax extensions check out the manual.

  5. We remove “unconfigured” items again. Think about it: we just expanded macros, so we may have emitted new items with cfg attributes! (There’s a small example of this after the list.)

  6. If we’re compiling tests, analyze & emit AST nodes describing a test harness. The details can be found in rustc::front::test::modify_for_testing.

  7. Prelude (std::prelude::*) imports are injected, unless #[no_implicit_prelude] is present at the crate level. The prelude is the set of items that are automatically & implicitly imported into your Rust code.

  8. We assign node IDs to AST nodes & build a mapping from those IDs to AST nodes. The AST map is useful for later phases of the compilation process.
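
To see why the unconfigured-item pass has to run both before and after expansion, consider a macro that expands into an item carrying a cfg attribute. This is just an illustrative snippet (using present-day macro_rules! syntax), not something from the compiler itself:

```rust
// A macro whose expansion produces an item with a #[cfg(...)] attribute.
// The item only comes into existence during expansion, so the compiler has
// to make a second pass to strip unconfigured items afterwards.
macro_rules! debug_helpers {
    () => {
        #[cfg(debug_assertions)]
        pub fn dump_state() {
            println!("debug build only");
        }
    };
}

debug_helpers!();

fn main() {
    // In a release build, both the item above and this call are stripped.
    #[cfg(debug_assertions)]
    dump_state();
}
```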

Finally, the compiler supports optionally dumping out the full AST after all of these steps have occurred. This is useful for developers hacking on the compiler.

phase_3_run_analysis_passes

This is essentially the semantic analysis phase. At this point we make multiple passes over the AST to perform various different tasks. In most cases you can find the details of the semantic analysis steps in rustc::middle::*.

A lot happens here and trying to cover all the detail isn’t really possible, but fortunately most of the top-level functions speak for themselves:

  1. Resolve external crates. An AST pass tracking used crates & externs required for later resolve steps. Handled by rustc::metadata::creader::read_crates.
  2. Name resolution. What does a name represent in a given context? A function? A local variable? A type? Rust has two namespaces: one for types and one for values. This allows you to refer to e.g. both the str type and the str module at the same time (see the examples after this list).
  3. Resolve lifetimes. Name resolution for lifetimes (I could be misunderstanding, but this is presumably for lifetime variables, e.g. 'f).
  4. Find the program entry point. Tries to locate the main function, or indicates that there is no entry point if #[no_main] is set.
  5. Find the macro registrar. Not entirely sure what this does, at a glance.
  6. Find freevars. Annotates for loops and functions with the free variables (freevars) within them.
  7. Region resolution. Described in more detail in the source documentation.
  8. Perform type checking. Documented in a fair bit of detail in the source.
  9. Static, constant and privacy checking
  10. Effect checking. This basically checks that the source program is not trying to do unsafe operations in safe contexts (i.e. outside of an ‘unsafe’ block/function). If it discovers a violation, compilation fails. (Illustrated in the examples after this list.)
  11. Loop checking. Make sure the source program isn’t using break and continue outside of a loop, amongst other things.
  12. A compute moves step. This checks whether an operation will result in a move & enforces rules on non-copyable (aka “linear”) types (also illustrated after this list).
  13. Match checking. Ensure that pattern matches are legal & exhaustive. Complain about unreachable patterns, etc.
  14. Liveness checking. Lots of details about this process documented in the source code.
  15. Borrow checking. Enforces Rust’s memory safety rules. Again, lots of details documented in the source code.
  16. Kind checking. Enforces special rules for built-in traits like Send and Drop.
  17. Reachability checking. There’s a brief summary of the details in the source.
  18. Dead code checking. Warns if the compiler can determine that certain code will never be executed.
  19. Lint checking. The docs in the source are short but sweet.
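
A few of these passes are easier to appreciate with a concrete snippet in front of you. These are small present-day Rust examples written for this post, not compiler code. First, the two namespaces from the name resolution step: str the module and str the type, side by side.

```rust
// `std::str` (a module of free functions) and `&str` (the string slice type)
// can both be referred to because types and values live in separate namespaces.
fn main() {
    let bytes = [104u8, 101, 121];
    let s: &str = std::str::from_utf8(&bytes).unwrap(); // `std::str` the module
    let owned: String = s.to_string();                  // `&str` the type
    println!("{} ({} bytes)", owned, s.len());
}
```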
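
Next, the kind of violation the effect checking pass exists to catch: an unsafe operation has to live inside an unsafe block (or an unsafe fn).

```rust
// Dereferencing a raw pointer is an unsafe operation, so it only compiles
// inside an `unsafe` block or function.
fn main() {
    let x = 5;
    let p = &x as *const i32;
    // let y = *p;         // rejected: raw pointer dereference outside `unsafe`
    let y = unsafe { *p }; // fine: we've explicitly opted in
    println!("{}", y);
}
```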
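
And finally, the sort of rule the compute-moves and borrow checking passes enforce together: once a non-copyable value has been moved, the original binding can’t be used any more.

```rust
// Vec<T> is not copyable, so assigning it moves ownership; using the old
// binding afterwards is a compile-time error.
fn main() {
    let v = vec![1, 2, 3];
    let w = v; // `v` is moved into `w` here
    // println!("{:?}", v); // rejected: use of moved value `v`
    println!("{:?}", w);
}
```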

Phew. I think I missed some parts, but hopefully that’s enough to help you understand the sort of stuff that’s happening here.

phase_4_translate_to_llvm

I consider this our transition from the semantic analysis phase into the code generation phase.

With the amount of stuff that’s going on in phase_3, you might be relieved to know that phase_4 is actually fairly simple & cohesive: essentially this phase calls out to rustc::middle::trans::base::trans_crate to transform Rust’s internal ast::Crate into an llvm::Module. Rust leans pretty heavily on LLVM for its code generation backend.

phase_5_run_llvm_passes

Another relatively simple step, though its name doesn’t explain what’s really going on here. At this point we’re well into code generation territory: the “LLVM passes” alluded to here are the calls to the LLVM API that emit native code or assembly or LLVM bitcode and write it out – typically to disk.

The code generation happens in rustc::back::link::write::run_passes, which selects the output formats, emits assembly/bitcode, and optionally writes object files to disk.

The LLVM pass to write to disk is actually wired up via a C++ wrapper function with C linkage called LLVMRustWriteOutputFile which in turn calls llvm::TargetMachine::addPassesToEmitFile. You can see the Rust side of this call in rustc::back::link::WriteOutputFile.

If we generated object files in the previous step & the compiler is configured to produce a native library or executable, the final pass is a link step. Ultimately this occurs in rustc::back::link::link_natively wherein Rust literally goes hunting for your platform’s cc program (the system C compiler) and uses it to invoke the linker to combine the object files together into an executable for the target platform. The arguments to the linker are built up in link_args.

By the time this function returns, compilation is complete & you can run your freshly compiled Rust binary!

What’s next?

To the source code, friend. Track down some issues to work on & get hacking.

If you made it this far – well done! Feel free to send me 140 characters or less of hate via Twitter @tglee or come along to a PDX-Rust meetup sometime.