Consolidate and fix code blocks

varkor 2018-04-05 13:24:08 +01:00 committed by Who? Me?!
parent cacdd73802
commit ad35712ddc
34 changed files with 582 additions and 424 deletions

View File

@@ -30,8 +30,8 @@ To help prevent accidentally introducing broken links, we use the
 invoke this link checker, otherwise it will emit a warning saying it couldn't
 be found.
-```
-$ cargo install mdbook-linkcheck
+```bash
+> cargo install mdbook-linkcheck
 ```
 You will need `mdbook` version `>= 0.1`. `linkcheck` will be run automatically
 when you run `mdbook build`.

View File

@@ -21,7 +21,7 @@ all the remainder. Only at the end of the block is there the
 possibility of branching to more than one place (in MIR, we call that
 final statement the **terminator**):
-```
+```mir
 bb0: {
     statement0;
     statement1;
@@ -34,7 +34,7 @@ bb0: {
 Many expressions that you are used to in Rust compile down to multiple
 basic blocks. For example, consider an if statement:
-```rust
+```rust,ignore
 a = 1;
 if some_variable {
     b = 1;
@@ -46,7 +46,7 @@ d = 1;
 This would compile into four basic blocks:
-```
+```mir
 BB0: {
     a = 1;
     if some_variable { goto BB1 } else { goto BB2 }

View File

@@ -3,7 +3,7 @@
 > **Note:** This is a copy of `@nrc`'s amazing [stupid-stats]. You should find
 > a copy of the code on the GitHub repository although due to the compiler's
 > constantly evolving nature, there is no guarantee it'll compile on the first
 > go.
 
 Many tools benefit from being a drop-in replacement for a compiler. By this, I
 mean that any user of the tool can use `mytool` in all the ways they would
@@ -177,7 +177,7 @@ foo.rs` (assuming you have a Rust program called `foo.rs`. You can also pass any
 command line arguments that you would normally pass to rustc). When you run it
 you'll see output similar to
-```
+```txt
 In crate: foo,
 Found 12 uses of `println!`;
@@ -205,7 +205,7 @@ should dump stupid-stats' stdout to Cargo's stdout).
 Let's start with the `main` function for our tool, it is pretty simple:
-```
+```rust,ignore
 fn main() {
     let args: Vec<_> = std::env::args().collect();
     rustc_driver::run_compiler(&args, &mut StupidCalls::new());
@@ -223,7 +223,7 @@ this tool different from rustc.
 `StupidCalls` is a mostly empty struct:
-```
+```rust,ignore
 struct StupidCalls {
     default_calls: RustcDefaultCalls,
 }
@@ -238,7 +238,7 @@ to keep Cargo happy.
 Most of the rest of the impl of `CompilerCalls` is trivial:
-```
+```rust,ignore
 impl<'a> CompilerCalls<'a> for StupidCalls {
     fn early_callback(&mut self,
                       _: &getopts::Matches,
@@ -300,7 +300,7 @@ tool does it's actual work by walking the AST. We do that by creating an AST
 visitor and making it walk the AST from the top (the crate root). Once we've
 walked the crate, we print the stats we've collected:
-```
+```rust,ignore
 fn build_controller(&mut self, _: &Session) -> driver::CompileController<'a> {
     // We mostly want to do what rustc does, which is what basic() will return.
     let mut control = driver::CompileController::basic();
@@ -340,7 +340,7 @@ That is all it takes to create your own drop-in compiler replacement or custom
 compiler! For the sake of completeness I'll go over the rest of the stupid-stats
 tool.
-```
+```rust
 struct StupidVisitor {
     println_count: usize,
     arg_counts: Vec<usize>,
@@ -355,7 +355,7 @@ methods, these walk the AST taking no action. We override `visit_item` and
 functions, modules, traits, structs, and so forth, we're only interested in
 functions) and macros:
-```
+```rust,ignore
 impl<'v> visit::Visitor<'v> for StupidVisitor {
     fn visit_item(&mut self, i: &'v ast::Item) {
         match i.node {

View File

@@ -61,7 +61,8 @@ which takes a single argument (which, in this case is a value of 1).
 (rather than the current Rust default of 101 at the time of this writing). The
 header command and the argument list (if present) are typically separated by a
 colon:
-```
+```rust,ignore
 // Copyright 2018 The Rust Project Developers. See the COPYRIGHT
 // file at the top-level directory of this distribution and at
 // http://rust-lang.org/COPYRIGHT.

View File

@@ -21,7 +21,7 @@ tidy script runs automatically when you do `./x.py test`.
 All files must begin with the following copyright notice:
-```
+```rust
 // Copyright 2012-2013 The Rust Project Developers. See the COPYRIGHT
 // file at the top-level directory of this distribution and at
 // http://rust-lang.org/COPYRIGHT.
@@ -48,7 +48,7 @@ tests -- it can be necessary to exempt yourself from this limit. In
 that case, you can add a comment towards the top of the file (after
 the copyright notice) like so:
-```
+```rust
 // ignore-tidy-linelength
 ```
@@ -61,7 +61,7 @@ Prefer 4-space indent.
 # Coding for correctness
 
 Beyond formatting, there are a few other tips that are worth
 following.
 
 ## Prefer exhaustive matches

View File

@@ -19,7 +19,7 @@ compilation improves, that may change.)
 The dependency structure of these crates is roughly a diamond:
-```
+```txt
                   rustc_driver
                 /      |       \
                /       |        \

View File

@@ -12,8 +12,8 @@ This chapter covers the main concepts of the HIR.
 You can view the HIR representation of your code by passing the
 `-Zunpretty=hir-tree` flag to rustc:
-```
-cargo rustc -- -Zunpretty=hir-tree
+```bash
+> cargo rustc -- -Zunpretty=hir-tree
 ```
 
 ### Out-of-band storage and the `Crate` type

View File

@@ -70,8 +70,8 @@ Once you've created a config.toml, you are now ready to run
 `x.py`. There are a lot of options here, but let's start with what is
 probably the best "go to" command for building a local rust:
-```
-./x.py build -i --stage 1 src/libstd
+```bash
+> ./x.py build -i --stage 1 src/libstd
 ```
 
 What this command will do is the following:
@@ -106,7 +106,7 @@ will execute the stage2 compiler (which we did not build, but which
 you will likely need to build at some point; for example, if you want
 to run the entire test suite).
-```
+```bash
 > rustup toolchain link stage1 build/<host-triple>/stage1
 > rustup toolchain link stage2 build/<host-triple>/stage2
 ```
@@ -115,7 +115,7 @@ Now you can run the rustc you built with. If you run with `-vV`, you
 should see a version number ending in `-dev`, indicating a build from
 your local environment:
-```
+```bash
 > rustc +stage1 -vV
 rustc 1.25.0-dev
 binary: rustc

View File

@@ -10,7 +10,7 @@ As an example, see `src/test/compile-fail/dep-graph-caller-callee.rs`.
 The idea is that you can annotate a test like:
-```rust
+```rust,ignore
 #[rustc_if_this_changed]
 fn foo() { }
@@ -48,7 +48,7 @@ the graph. You can filter in three ways:
 To filter, use the `RUST_DEP_GRAPH_FILTER` environment variable, which should
 look like one of the following:
-```
+```txt
 source_filter     // nodes originating from source_filter
 -> target_filter  // nodes that can reach target_filter
 source_filter -> target_filter // nodes in between source_filter and target_filter
@@ -58,14 +58,14 @@ source_filter -> target_filter // nodes in between source_filter and target_filt
 A node is considered to match a filter if all of those strings appear in its
 label. So, for example:
-```
+```txt
 RUST_DEP_GRAPH_FILTER='-> TypeckTables'
 ```
 
 would select the predecessors of all `TypeckTables` nodes. Usually though you
 want the `TypeckTables` node for some particular fn, so you might write:
-```
+```txt
 RUST_DEP_GRAPH_FILTER='-> TypeckTables & bar'
 ```
@@ -75,7 +75,7 @@ with `bar` in their name.
 Perhaps you are finding that when you change `foo` you need to re-type-check
 `bar`, but you don't think you should have to. In that case, you might do:
-```
+```txt
 RUST_DEP_GRAPH_FILTER='Hir & foo -> TypeckTables & bar'
 ```
@@ -105,8 +105,10 @@ check of `bar` and you don't think there should be. You dump the
 dep-graph as described in the previous section and open `dep-graph.txt`
 to see something like:
-    Hir(foo) -> Collect(bar)
-    Collect(bar) -> TypeckTables(bar)
+```txt
+Hir(foo) -> Collect(bar)
+Collect(bar) -> TypeckTables(bar)
+```
 
 That first edge looks suspicious to you. So you set
 `RUST_FORBID_DEP_GRAPH_EDGE` to `Hir&foo -> Collect&bar`, re-run, and

View File

@@ -15,7 +15,7 @@ expansion works.
 It's helpful to have an example to refer to. For the remainder of this chapter,
 whenever we refer to the "example _definition_", we mean the following:
-```rust
+```rust,ignore
 macro_rules! printer {
     (print $mvar:ident) => {
         println!("{}", $mvar);
@@ -45,7 +45,7 @@ worrying about _where_. For more information about tokens, see the
 Whenever we refer to the "example _invocation_", we mean the following snippet:
-```rust
+```rust,ignore
 printer!(print foo); // Assume `foo` is a variable defined somewhere else...
 ```
@@ -65,7 +65,7 @@ defined in [`src/libsyntax/ext/tt/macro_parser.rs`][code_mp].
 The interface of the macro parser is as follows (this is slightly simplified):
-```rust
+```rust,ignore
 fn parse(
     sess: ParserSession,
     tts: TokenStream,

View File

@@ -8,13 +8,13 @@ the code itself, naturally.
 One way to think of method lookup is that we convert an expression of
 the form:
-```rust
+```rust,ignore
 receiver.method(...)
 ```
 
 into a more explicit [UFCS] form:
-```rust
+```rust,ignore
 Trait::method(ADJ(receiver), ...) // for a trait call
 ReceiverType::method(ADJ(receiver), ...) // for an inherent method call
 ```
@@ -24,7 +24,7 @@ autoderefs and then possibly an autoref (e.g., `&**receiver`). However
 we sometimes do other adjustments and coercions along the way, in
 particular unsizing (e.g., converting from `[T; n]` to `[T]`).
 
 Method lookup is divided into two major phases:
 
 1. Probing ([`probe.rs`][probe]). The probe phase is when we decide what method
    to call and how to adjust the receiver.
@@ -51,7 +51,7 @@ until it cannot be deref'd anymore, as well as applying an optional
 "unsize" step. So if the receiver has type `Rc<Box<[T; 3]>>`, this
 might yield:
-```rust
+```rust,ignore
 Rc<Box<[T; 3]>>
 Box<[T; 3]>
 [T; 3]
@@ -99,9 +99,10 @@ So, let's continue our example. Imagine that we were calling a method
 that defines it with `&self` for the type `Rc<U>` as well as a method
 on the type `Box` that defines `Foo` but with `&mut self`. Then we
 might have two candidates:
+```txt
 &Rc<Box<[T; 3]>> from the impl of `Foo` for `Rc<U>` where `U=Box<T; 3]>
 &mut Box<[T; 3]>> from the inherent impl on `Box<U>` where `U=[T; 3]`
+```
 
 ### Candidate search
### Candidate search ### Candidate search

View File

@@ -52,13 +52,13 @@ fn main() {
 The files have names like `rustc.main.000-000.CleanEndRegions.after.mir`. These
 names have a number of parts:
-```
+```txt
 rustc.main.000-000.CleanEndRegions.after.mir
       ---- --- --- --------------- ----- either before or after
       |    |   |   name of the pass
       |    |   index of dump within the pass (usually 0, but some passes dump intermediate states)
       |    index of the pass
       def-path to the function etc being dumped
 ```
 
 You can also make more selective filters. For example, `main & CleanEndRegions`
@@ -159,7 +159,7 @@ ensuring that the reads have already happened (remember that
 [queries are memoized](./query.html), so executing a query twice
 simply loads from a cache the second time):
-```
+```txt
 mir_const(D) --read-by--> mir_const_qualif(D)
      |                       ^
  stolen-by                   |

View File

@@ -34,7 +34,7 @@ The MIR-based region analysis consists of two major functions:
   are used.
 - More details to come, though the [NLL RFC] also includes fairly thorough
   (and hopefully readable) coverage.
 
 [fvb]: appendix-background.html#free-vs-bound
 [NLL RFC]: http://rust-lang.github.io/rfcs/2094-nll.html
@@ -82,7 +82,7 @@ The kinds of region elements are as follows:
   corresponds (intuitively) to some unknown set of other elements --
   for details on skolemization, see the section
   [skolemization and universes](#skol).
 
 ## Causal tracking
 
 *to be written* -- describe how we can extend the values of a variable
@@ -97,7 +97,7 @@ The kinds of region elements are as follows:
 From time to time we have to reason about regions that we can't
 concretely know. For example, consider this program:
-```rust
+```rust,ignore
 // A function that needs a static reference
 fn foo(x: &'static u32) { }
@@ -122,10 +122,12 @@ stack, for example). But *how* do we reject it and *why*?
 When we type-check `main`, and in particular the call `bar(foo)`, we
 are going to wind up with a subtyping relationship like this one:
-    fn(&'static u32) <: for<'a> fn(&'a u32)
-    ----------------    -------------------
-    the type of `foo`   the type `bar` expects
+```txt
+fn(&'static u32) <: for<'a> fn(&'a u32)
+----------------    -------------------
+the type of `foo`   the type `bar` expects
+```
 
 We handle this sort of subtyping by taking the variables that are
 bound in the supertype and **skolemizing** them: this means that we
 replace them with
@@ -135,8 +137,10 @@ regions" -- they represent, basically, "some unknown region".
 Once we've done that replacement, we have the following relation:
-    fn(&'static u32) <: fn(&'!1 u32)
+```txt
+fn(&'static u32) <: fn(&'!1 u32)
+```
 
 The key idea here is that this unknown region `'!1` is not related to
 any other regions. So if we can prove that the subtyping relationship
 is true for `'!1`, then it ought to be true for any region, which is
@@ -147,7 +151,9 @@ subtypes, we check if their arguments have the desired relationship
 (fn arguments are [contravariant](./appendix-background.html#variance), so
 we swap the left and right here):
-    &'!1 u32 <: &'static u32
+```txt
+&'!1 u32 <: &'static u32
+```
 
 According to the basic subtyping rules for a reference, this will be
 true if `'!1: 'static`. That is -- if "some unknown region `!1`" lives
@@ -168,7 +174,7 @@ put generic type parameters into this root universe (in this sense,
 there is not just one root universe, but one per item). So consider
 this function `bar`:
-```rust
+```rust,ignore
 struct Foo { }
 
 fn bar<'a, T>(t: &'a T) {
@@ -185,7 +191,7 @@ Basically, the root universe contains all the names that
 Now let's extend `bar` a bit by adding a variable `x`:
-```rust
+```rust,ignore
 fn bar<'a, T>(t: &'a T) {
     let x: for<'b> fn(&'b u32) = ...;
 }
@@ -195,7 +201,7 @@ Here, the name `'b` is not part of the root universe. Instead, when we
 "enter" into this `for<'b>` (e.g., by skolemizing it), we will create
 a child universe of the root, let's call it U1:
-```
+```txt
 U0 (root universe)
 └─ U1 (child universe)
@@ -207,7 +213,7 @@ with a new name, which we are identifying by its universe number:
 Now let's extend `bar` a bit by adding one more variable, `y`:
-```rust
+```rust,ignore
 fn bar<'a, T>(t: &'a T) {
     let x: for<'b> fn(&'b u32) = ...;
     let y: for<'c> fn(&'b u32) = ...;
@@ -218,7 +224,7 @@ When we enter *this* type, we will again create a new universe, which
 we'll call `U2`. Its parent will be the root universe, and U1 will be
 its sibling:
-```
+```txt
 U0 (root universe)
 ├─ U1 (child universe)
@@ -257,11 +263,11 @@ children, that inference variable X would have to be in U0. And since
 X is in U0, it cannot name anything from U1 (or U2). This is perhaps easiest
 to see by using a kind of generic "logic" example:
-```
+```txt
 exists<X> {
    forall<Y> { ... /* Y is in U1 ... */ }
    forall<Z> { ... /* Z is in U2 ... */ }
 }
 ```
 
 Here, the only way for the two foralls to interact would be through X,
@@ -290,8 +296,10 @@ does not say region elements **will** appear.
 In the region inference engine, outlives constraints have the form:
-    V1: V2 @ P
+```txt
+V1: V2 @ P
+```
 
 where `V1` and `V2` are region indices, and hence map to some region
 variable (which may be universally or existentially quantified). The
 `P` here is a "point" in the control-flow graph; it's not important
@@ -338,8 +346,10 @@ for universal regions from the fn signature.)
 Put another way, the "universal regions" check can be considered to be
 checking constraints like:
-    {skol(1)}: V1
+```txt
+{skol(1)}: V1
+```
 
 where `{skol(1)}` is like a constant set, and V1 is the variable we
 made to represent the `!1` region.
@@ -348,30 +358,40 @@ made to represent the `!1` region.
 OK, so far so good. Now let's walk through what would happen with our
 first example:
-    fn(&'static u32) <: fn(&'!1 u32) @ P // this point P is not imp't here
+```txt
+fn(&'static u32) <: fn(&'!1 u32) @ P // this point P is not imp't here
+```
 
 The region inference engine will create a region element domain like this:
-    { CFG; end('static); skol(1) }
-        ---  ------------  ------- from the universe `!1`
-         |    'static is always in scope
-        all points in the CFG; not especially relevant here
+```txt
+{ CFG; end('static); skol(1) }
+    ---  ------------  ------- from the universe `!1`
+     |    'static is always in scope
+    all points in the CFG; not especially relevant here
+```
 
 It will always create two universal variables, one representing
 `'static` and one representing `'!1`. Let's call them Vs and V1. They
 will have initial values like so:
-    Vs = { CFG; end('static) } // it is in U0, so can't name anything else
-    V1 = { skol(1) }
+```txt
+Vs = { CFG; end('static) } // it is in U0, so can't name anything else
+V1 = { skol(1) }
+```
 
 From the subtyping constraint above, we would have an outlives constraint like
-    '!1: 'static @ P
+```txt
+'!1: 'static @ P
+```
 
 To process this, we would grow the value of V1 to include all of Vs:
-    Vs = { CFG; end('static) }
-    V1 = { CFG; end('static), skol(1) }
+```txt
+Vs = { CFG; end('static) }
+V1 = { CFG; end('static), skol(1) }
+```
 
 At that point, constraint propagation is complete, because all the
 outlives relationships are satisfied. Then we would go to the "check
@@ -385,34 +405,44 @@ In this case, `V1` *did* grow too large -- it is not known to outlive
 What about this subtyping relationship?
-    for<'a> fn(&'a u32, &'a u32)
-        <:
-    for<'b, 'c> fn(&'b u32, &'c u32)
+```txt
+for<'a> fn(&'a u32, &'a u32)
+    <:
+for<'b, 'c> fn(&'b u32, &'c u32)
+```
 
 Here we would skolemize the supertype, as before, yielding:
-    for<'a> fn(&'a u32, &'a u32)
-        <:
-    fn(&'!1 u32, &'!2 u32)
+```txt
+for<'a> fn(&'a u32, &'a u32)
+    <:
+fn(&'!1 u32, &'!2 u32)
+```
 
 then we instantiate the variable on the left-hand side with an
 existential in universe U2, yielding the following (`?n` is a notation
 for an existential variable):
-    fn(&'?3 u32, &'?3 u32)
-        <:
-    fn(&'!1 u32, &'!2 u32)
+```txt
+fn(&'?3 u32, &'?3 u32)
+    <:
+fn(&'!1 u32, &'!2 u32)
+```
 
 Then we break this down further:
-    &'!1 u32 <: &'?3 u32
-    &'!2 u32 <: &'?3 u32
+```txt
+&'!1 u32 <: &'?3 u32
+&'!2 u32 <: &'?3 u32
+```
 
 and even further, yield up our region constraints:
-    '!1: '?3
-    '!2: '?3
+```txt
+'!1: '?3
+'!2: '?3
+```
 
 Note that, in this case, both `'!1` and `'!2` have to outlive the
 variable `'?3`, but the variable `'?3` is not forced to outlive
 anything else. Therefore, it simply starts and ends as the empty set
@@ -430,15 +460,17 @@ common lifetime of our arguments. -nmatsakis)
 [ohdeargoditsallbroken]: https://github.com/rust-lang/rust/issues/32330#issuecomment-202536977
 
 ## Final example
 
 Let's look at one last example. We'll extend the previous one to have
 a return type:
-    for<'a> fn(&'a u32, &'a u32) -> &'a u32
-        <:
-    for<'b, 'c> fn(&'b u32, &'c u32) -> &'b u32
+```txt
+for<'a> fn(&'a u32, &'a u32) -> &'a u32
+    <:
+for<'b, 'c> fn(&'b u32, &'c u32) -> &'b u32
+```
 
 Despite seeming very similar to the previous example, this case is going to get
 an error. That's good: the problem is that we've gone from a fn that promises
 to return one of its two arguments, to a fn that is promising to return the
@ -446,45 +478,59 @@ first one. That is unsound. Let's see how it plays out.
First, we skolemize the supertype: First, we skolemize the supertype:
for<'a> fn(&'a u32, &'a u32) -> &'a u32 ```txt
<: for<'a> fn(&'a u32, &'a u32) -> &'a u32
fn(&'!1 u32, &'!2 u32) -> &'!1 u32 <:
fn(&'!1 u32, &'!2 u32) -> &'!1 u32
```
Then we instantiate the subtype with existentials (in U2): Then we instantiate the subtype with existentials (in U2):
fn(&'?3 u32, &'?3 u32) -> &'?3 u32 ```txt
<: fn(&'?3 u32, &'?3 u32) -> &'?3 u32
fn(&'!1 u32, &'!2 u32) -> &'!1 u32 <:
fn(&'!1 u32, &'!2 u32) -> &'!1 u32
```
And now we create the subtyping relationships: And now we create the subtyping relationships:
&'!1 u32 <: &'?3 u32 // arg 1 ```txt
&'!2 u32 <: &'?3 u32 // arg 2 &'!1 u32 <: &'?3 u32 // arg 1
&'?3 u32 <: &'!1 u32 // return type &'!2 u32 <: &'?3 u32 // arg 2
&'?3 u32 <: &'!1 u32 // return type
```
And finally the outlives relationships. Here, let V1, V2, and V3 be the And finally the outlives relationships. Here, let V1, V2, and V3 be the
variables we assign to `!1`, `!2`, and `?3` respectively: variables we assign to `!1`, `!2`, and `?3` respectively:
V1: V3 ```txt
V2: V3 V1: V3
V3: V1 V2: V3
V3: V1
```
Those variables will have these initial values: Those variables will have these initial values:
V1 in U1 = {skol(1)} ```txt
V2 in U2 = {skol(2)} V1 in U1 = {skol(1)}
V3 in U2 = {} V2 in U2 = {skol(2)}
V3 in U2 = {}
```
Now because of the `V3: V1` constraint, we have to add `skol(1)` into `V3` (and Now because of the `V3: V1` constraint, we have to add `skol(1)` into `V3` (and
indeed it is visible from `V3`), so we get: indeed it is visible from `V3`), so we get:
V3 in U2 = {skol(1)} ```txt
V3 in U2 = {skol(1)}
```
then we have this constraint `V2: V3`, so we wind up having to enlarge then we have this constraint `V2: V3`, so we wind up having to enlarge
`V2` to include `skol(1)` (which it can also see): `V2` to include `skol(1)` (which it can also see):
V2 in U2 = {skol(1), skol(2)} ```txt
V2 in U2 = {skol(1), skol(2)}
```
Now constraint propagation is done, but when we check the outlives Now constraint propagation is done, but when we check the outlives
relationships, we find that `V2` includes this new element `skol(1)`, relationships, we find that `V2` includes this new element `skol(1)`,
so we report an error. so we report an error.
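The propagate-then-check procedure above can be sketched as a small fixed-point computation. This is a deliberately simplified model, not rustc's actual region code: `repr[i]` records which placeholder ("skolemized" region), if any, variable `Vi` was created to stand for, and a constraint `(a, b)` encodes `Va: Vb`, so everything in `Vb` must flow into `Va`.

```rust
use std::collections::BTreeSet;

/// Simplified model (not rustc's real code): each region variable `Vi`
/// holds a set of placeholder region ids; `repr[i]` is the placeholder,
/// if any, that `Vi` stands for; `(a, b)` encodes `Va: Vb`.
fn check_outlives(
    repr: &[Option<u32>],
    constraints: &[(usize, usize)],
    mut values: Vec<BTreeSet<u32>>,
) -> Result<Vec<BTreeSet<u32>>, String> {
    // Step 1: propagate to a fixed point, enlarging the supersets.
    let mut changed = true;
    while changed {
        changed = false;
        for &(a, b) in constraints {
            let missing: Vec<u32> =
                values[b].difference(&values[a]).cloned().collect();
            if !missing.is_empty() {
                values[a].extend(missing);
                changed = true;
            }
        }
    }
    // Step 2: a variable standing for placeholder `p` may contain only `p`;
    // a foreign placeholder means we forced an unprovable outlives relation.
    for (i, set) in values.iter().enumerate() {
        if let Some(p) = repr[i] {
            if set.iter().any(|&q| q != p) {
                return Err(format!("V{} acquired a foreign placeholder", i + 1));
            }
        }
    }
    Ok(values)
}

fn main() {
    // V1 = skol(1), V2 = skol(2), V3 existential, with the constraints
    // V1: V3, V2: V3, V3: V1 (indices are zero-based below). As in the
    // text, V2 winds up containing skol(1), so we get an error.
    let repr = [Some(1), Some(2), None];
    let constraints = [(0, 2), (1, 2), (2, 0)];
    let values = vec![
        [1u32].iter().cloned().collect(),
        [2u32].iter().cloned().collect(),
        BTreeSet::new(),
    ];
    assert!(check_outlives(&repr, &constraints, values).is_err());
}
```

Dropping the `V2: V3` constraint makes the same setup succeed, mirroring the earlier, sound example.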


@ -13,7 +13,7 @@ To implement a visitor, you have to create a type that represents
your visitor. Typically, this type wants to "hang on" to whatever your visitor. Typically, this type wants to "hang on" to whatever
state you will need while processing MIR: state you will need while processing MIR:
```rust ```rust,ignore
struct MyVisitor<...> { struct MyVisitor<...> {
tcx: TyCtxt<'cx, 'tcx, 'tcx>, tcx: TyCtxt<'cx, 'tcx, 'tcx>,
... ...
@ -22,10 +22,10 @@ struct MyVisitor<...> {
and you then implement the `Visitor` or `MutVisitor` trait for that type: and you then implement the `Visitor` or `MutVisitor` trait for that type:
```rust ```rust,ignore
impl<'tcx> MutVisitor<'tcx> for NoLandingPads { impl<'tcx> MutVisitor<'tcx> for NoLandingPads {
fn visit_foo(&mut self, ...) { fn visit_foo(&mut self, ...) {
// ... ...
self.super_foo(...); self.super_foo(...);
} }
} }
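The visit/super split can be sketched outside of rustc with a toy IR (all names here are hypothetical stand-ins, not the real MIR types): overriding `visit_*` lets you intercept a node, and calling `super_*` keeps the default recursion into subcomponents.

```rust
// Hypothetical mini-IR, not rustc's actual types.
struct Block { stmts: Vec<Stmt> }
struct Stmt;

trait Visitor {
    // `visit_*` methods are the override points...
    fn visit_block(&mut self, block: &Block) {
        self.super_block(block);
    }
    fn visit_stmt(&mut self, stmt: &Stmt) {
        self.super_stmt(stmt);
    }
    // ...while `super_*` methods supply the default recursion.
    fn super_block(&mut self, block: &Block) {
        for stmt in &block.stmts {
            self.visit_stmt(stmt);
        }
    }
    fn super_stmt(&mut self, _stmt: &Stmt) {}
}

/// A visitor that "hangs on" to state: it counts statements.
struct CountStmts { count: usize }

impl Visitor for CountStmts {
    fn visit_stmt(&mut self, stmt: &Stmt) {
        self.count += 1;
        self.super_stmt(stmt); // keep recursing into subcomponents
    }
}

fn count(block: &Block) -> usize {
    let mut visitor = CountStmts { count: 0 };
    visitor.visit_block(block);
    visitor.count
}

fn main() {
    let block = Block { stmts: vec![Stmt, Stmt] };
    assert_eq!(count(&block), 2);
}
```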


@ -69,12 +69,12 @@ fn main() {
You should see something like: You should see something like:
``` ```mir
// WARNING: This output format is intended for human consumers only // WARNING: This output format is intended for human consumers only
// and is subject to change without notice. Knock yourself out. // and is subject to change without notice. Knock yourself out.
fn main() -> () { fn main() -> () {
... ...
} }
``` ```
This is the MIR format for the `main` function. This is the MIR format for the `main` function.
@ -82,7 +82,7 @@ This is the MIR format for the `main` function.
**Variable declarations.** If we drill in a bit, we'll see it begins **Variable declarations.** If we drill in a bit, we'll see it begins
with a bunch of variable declarations. They look like this: with a bunch of variable declarations. They look like this:
``` ```mir
let mut _0: (); // return place let mut _0: (); // return place
scope 1 { scope 1 {
let mut _1: std::vec::Vec<i32>; // "vec" in scope 1 at src/main.rs:2:9: 2:16 let mut _1: std::vec::Vec<i32>; // "vec" in scope 1 at src/main.rs:2:9: 2:16
@ -107,8 +107,8 @@ program (which names were in scope when).
it may look slightly different when you view it, and I am ignoring some of the it may look slightly different when you view it, and I am ignoring some of the
comments): comments):
``` ```mir
bb0: { bb0: {
StorageLive(_1); StorageLive(_1);
_1 = const <std::vec::Vec<T>>::new() -> bb2; _1 = const <std::vec::Vec<T>>::new() -> bb2;
} }
@ -117,7 +117,7 @@ bb0: {
A basic block is defined by a series of **statements** and a final A basic block is defined by a series of **statements** and a final
**terminator**. In this case, there is one statement: **terminator**. In this case, there is one statement:
``` ```mir
StorageLive(_1); StorageLive(_1);
``` ```
@ -129,7 +129,7 @@ allocate stack space.
The **terminator** of the block `bb0` is the call to `Vec::new`: The **terminator** of the block `bb0` is the call to `Vec::new`:
``` ```mir
_1 = const <std::vec::Vec<T>>::new() -> bb2; _1 = const <std::vec::Vec<T>>::new() -> bb2;
``` ```
@ -142,8 +142,8 @@ possible, and hence we list only one successor block, `bb2`.
If we look ahead to `bb2`, we will see it looks like this: If we look ahead to `bb2`, we will see it looks like this:
``` ```mir
bb2: { bb2: {
StorageLive(_3); StorageLive(_3);
_3 = &mut _1; _3 = &mut _1;
_2 = const <std::vec::Vec<T>>::push(move _3, const 1i32) -> [return: bb3, unwind: bb4]; _2 = const <std::vec::Vec<T>>::push(move _3, const 1i32) -> [return: bb3, unwind: bb4];
@ -153,13 +153,13 @@ bb2: {
Here there are two statements: another `StorageLive`, introducing the `_3` Here there are two statements: another `StorageLive`, introducing the `_3`
temporary, and then an assignment: temporary, and then an assignment:
``` ```mir
_3 = &mut _1; _3 = &mut _1;
``` ```
Assignments in general have the form: Assignments in general have the form:
``` ```txt
<Place> = <Rvalue> <Place> = <Rvalue>
``` ```
@ -169,7 +169,7 @@ value: in this case, the rvalue is a mutable borrow expression, which
looks like `&mut <Place>`. So we can kind of define a grammar for looks like `&mut <Place>`. So we can kind of define a grammar for
rvalues like so: rvalues like so:
``` ```txt
<Rvalue> = & (mut)? <Place> <Rvalue> = & (mut)? <Place>
| <Operand> + <Operand> | <Operand> + <Operand>
| <Operand> - <Operand> | <Operand> - <Operand>
@ -178,7 +178,7 @@ rvalues like so:
<Operand> = Constant <Operand> = Constant
| copy Place | copy Place
| move Place | move Place
``` ```
As you can see from this grammar, rvalues cannot be nested -- they can As you can see from this grammar, rvalues cannot be nested -- they can
only reference places and constants. Moreover, when you use a place, only reference places and constants. Moreover, when you use a place,
@ -188,7 +188,7 @@ for a place of any type). So, for example, if we had the expression `x
= a + b + c` in Rust, that would get compiled to two statements and a = a + b + c` in Rust, that would get compiled to two statements and a
temporary: temporary:
``` ```mir
TMP1 = a + b TMP1 = a + b
x = TMP1 + c x = TMP1 + c
``` ```
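That flattening can be sketched as a toy lowering pass (my own illustration, not rustc's actual MIR builder): nested operands get fresh temporaries, so every emitted statement has at most one operator on its right-hand side.

```rust
/// A toy three-address-code flattener (illustrative only).
enum Expr {
    Var(&'static str),
    Add(Box<Expr>, Box<Expr>),
}

// Flatten an operand position: variables are used directly; nested
// additions are hoisted into fresh temporaries.
fn flatten_operand(e: &Expr, tmp: &mut usize, out: &mut Vec<String>) -> String {
    match e {
        Expr::Var(name) => name.to_string(),
        Expr::Add(a, b) => {
            let lhs = flatten_operand(a, tmp, out);
            let rhs = flatten_operand(b, tmp, out);
            *tmp += 1;
            let name = format!("TMP{}", tmp);
            out.push(format!("{} = {} + {}", name, lhs, rhs));
            name
        }
    }
}

// The top-level assignment writes directly into its target place.
fn flatten_assign(target: &str, e: &Expr, tmp: &mut usize, out: &mut Vec<String>) {
    match e {
        Expr::Var(name) => out.push(format!("{} = {}", target, name)),
        Expr::Add(a, b) => {
            let lhs = flatten_operand(a, tmp, out);
            let rhs = flatten_operand(b, tmp, out);
            out.push(format!("{} = {} + {}", target, lhs, rhs));
        }
    }
}

fn main() {
    // x = a + b + c
    let e = Expr::Add(
        Box::new(Expr::Add(Box::new(Expr::Var("a")), Box::new(Expr::Var("b")))),
        Box::new(Expr::Var("c")),
    );
    let (mut tmp, mut out) = (0, Vec::new());
    flatten_assign("x", &e, &mut tmp, &mut out);
    assert_eq!(out, vec!["TMP1 = a + b", "x = TMP1 + c"]);
}
```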
@ -214,14 +214,14 @@ but [you can read about those below](#promoted)).
we pass around `BasicBlock` values, which are we pass around `BasicBlock` values, which are
[newtype'd] indices into this vector. [newtype'd] indices into this vector.
- **Statements** are represented by the type `Statement`. - **Statements** are represented by the type `Statement`.
- **Terminators** are represented by the `Terminator`. - **Terminators** are represented by the `Terminator`.
- **Locals** are represented by a [newtype'd] index type `Local`. The - **Locals** are represented by a [newtype'd] index type `Local`. The
data for a local variable is found in the `Mir` (the `local_decls` data for a local variable is found in the `Mir` (the `local_decls`
vector). There is also a special constant `RETURN_PLACE` identifying vector). There is also a special constant `RETURN_PLACE` identifying
the special "local" representing the return value. the special "local" representing the return value.
- **Places** are identified by the enum `Place`. There are a few variants: - **Places** are identified by the enum `Place`. There are a few variants:
- Local variables like `_1` - Local variables like `_1`
- Static variables `FOO` - Static variables `FOO`
- **Projections**, which are fields or other things that "project - **Projections**, which are fields or other things that "project
out" from a base place. So e.g. the place `_1.f` is a projection, out" from a base place. So e.g. the place `_1.f` is a projection,
with `f` being the "projection element" and `_1` being the base with `f` being the "projection element" and `_1` being the base
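The newtype'd-index idea is easy to demonstrate outside the compiler (the field and method names below are hypothetical stand-ins for the real `Mir` type): wrapping the `usize` in a dedicated type means an index into one vector cannot silently be used to index the other.

```rust
// Each index family gets its own type; mixing them up is a compile error.
#[derive(Copy, Clone, PartialEq, Eq, Debug)]
struct BasicBlock(usize);

#[derive(Copy, Clone, PartialEq, Eq, Debug)]
struct Local(usize);

// A special constant identifying the "local" for the return value.
const RETURN_PLACE: Local = Local(0);

struct Mir {
    basic_blocks: Vec<String>, // stand-in for real basic-block data
    local_decls: Vec<String>,  // stand-in for real local-variable data
}

impl Mir {
    fn block(&self, bb: BasicBlock) -> &str {
        &self.basic_blocks[bb.0]
    }
    fn local(&self, local: Local) -> &str {
        &self.local_decls[local.0]
    }
}

fn main() {
    let mir = Mir {
        basic_blocks: vec!["bb0".to_string(), "bb1".to_string()],
        local_decls: vec!["_0: ()".to_string(), "_1: Vec<i32>".to_string()],
    };
    assert_eq!(mir.block(BasicBlock(1)), "bb1");
    assert_eq!(mir.local(RETURN_PLACE), "_0: ()");
    // `mir.local(BasicBlock(0))` would not type-check: the wrapper types
    // document intent and prevent accidental index mixups.
}
```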


@ -14,7 +14,7 @@ placed into metadata.
Once you have a use-site like Once you have a use-site like
```rust ```rust,ignore
type Foo = [u8; FOO - 42]; type Foo = [u8; FOO - 42];
``` ```
@ -24,7 +24,7 @@ create items that use the type (locals, constants, function arguments, ...).
To obtain the (in this case empty) parameter environment, one can call To obtain the (in this case empty) parameter environment, one can call
`let param_env = tcx.param_env(length_def_id);`. The `GlobalId` needed is `let param_env = tcx.param_env(length_def_id);`. The `GlobalId` needed is
```rust ```rust,ignore
let gid = GlobalId { let gid = GlobalId {
promoted: None, promoted: None,
instance: Instance::mono(length_def_id), instance: Instance::mono(length_def_id),


@ -41,7 +41,7 @@ To invoke a query is simple. The tcx ("type context") offers a method
for each defined query. So, for example, to invoke the `type_of` for each defined query. So, for example, to invoke the `type_of`
query, you would just do this: query, you would just do this:
```rust ```rust,ignore
let ty = tcx.type_of(some_def_id); let ty = tcx.type_of(some_def_id);
``` ```
@ -59,7 +59,7 @@ better user experience. In order to recover from a cycle, you don't
get to use the nice method-call-style syntax. Instead, you invoke get to use the nice method-call-style syntax. Instead, you invoke
using the `try_get` method, which looks roughly like this: using the `try_get` method, which looks roughly like this:
```rust ```rust,ignore
use ty::maps::queries; use ty::maps::queries;
... ...
match queries::type_of::try_get(tcx, DUMMY_SP, self.did) { match queries::type_of::try_get(tcx, DUMMY_SP, self.did) {
@ -87,7 +87,7 @@ will be reported due to this cycle by some other bit of code. In that
case, you can invoke `err.cancel()` to not emit any error. It is case, you can invoke `err.cancel()` to not emit any error. It is
traditional to then invoke: traditional to then invoke:
``` ```rust,ignore
tcx.sess.delay_span_bug(some_span, "some message") tcx.sess.delay_span_bug(some_span, "some message")
``` ```
@ -126,7 +126,7 @@ on how that works).
Providers always have the same signature: Providers always have the same signature:
```rust ```rust,ignore
fn provider<'cx, 'tcx>(tcx: TyCtxt<'cx, 'tcx, 'tcx>, fn provider<'cx, 'tcx>(tcx: TyCtxt<'cx, 'tcx, 'tcx>,
key: QUERY_KEY) key: QUERY_KEY)
-> QUERY_RESULT -> QUERY_RESULT
@ -146,7 +146,7 @@ When the tcx is created, it is given the providers by its creator using
the `Providers` struct. This struct is generated by the macros here, but it the `Providers` struct. This struct is generated by the macros here, but it
is basically a big list of function pointers: is basically a big list of function pointers:
```rust ```rust,ignore
struct Providers { struct Providers {
type_of: for<'cx, 'tcx> fn(TyCtxt<'cx, 'tcx, 'tcx>, DefId) -> Ty<'tcx>, type_of: for<'cx, 'tcx> fn(TyCtxt<'cx, 'tcx, 'tcx>, DefId) -> Ty<'tcx>,
... ...
@ -163,7 +163,7 @@ throughout the other `rustc_*` crates. This is done by invoking
various `provide` functions. These functions tend to look something various `provide` functions. These functions tend to look something
like this: like this:
```rust ```rust,ignore
pub fn provide(providers: &mut Providers) { pub fn provide(providers: &mut Providers) {
*providers = Providers { *providers = Providers {
type_of, type_of,
@ -180,7 +180,7 @@ before.) So, if we want to add a provider for some other query,
let's call it `fubar`, into the crate above, we might modify the `provide()` let's call it `fubar`, into the crate above, we might modify the `provide()`
function like so: function like so:
```rust ```rust,ignore
pub fn provide(providers: &mut Providers) { pub fn provide(providers: &mut Providers) {
*providers = Providers { *providers = Providers {
type_of, type_of,
@ -189,7 +189,7 @@ pub fn provide(providers: &mut Providers) {
}; };
} }
fn fubar<'cx, 'tcx>(tcx: TyCtxt<'cx, 'tcx>, key: DefId) -> Fubar<'tcx> { .. } fn fubar<'cx, 'tcx>(tcx: TyCtxt<'cx, 'tcx>, key: DefId) -> Fubar<'tcx> { ... }
``` ```
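The provider/memoization split can be sketched as a toy query context (all names hypothetical; the real machinery lives in the `ty::maps` macros): the "tcx" owns a table of plain function pointers plus a cache, and invoking a query consults the cache before falling back to the provider.

```rust
use std::collections::HashMap;

// Providers are plain function pointers, mirroring the `Providers` struct.
struct Providers {
    type_of: fn(key: u32) -> String,
}

struct Tcx {
    providers: Providers,
    type_of_cache: HashMap<u32, String>,
    provider_calls: usize, // for demonstration: how often the provider ran
}

impl Tcx {
    fn new(providers: Providers) -> Tcx {
        Tcx { providers, type_of_cache: HashMap::new(), provider_calls: 0 }
    }

    // Invoking the query checks the cache first; the provider is only
    // executed the first time a given key is requested.
    fn type_of(&mut self, key: u32) -> String {
        if let Some(cached) = self.type_of_cache.get(&key) {
            return cached.clone();
        }
        self.provider_calls += 1;
        let value = (self.providers.type_of)(key);
        self.type_of_cache.insert(key, value.clone());
        value
    }
}

// A provider: same signature shape as above, minus the real `TyCtxt`.
fn type_of(key: u32) -> String {
    format!("Ty#{}", key)
}

fn main() {
    let mut tcx = Tcx::new(Providers { type_of });
    assert_eq!(tcx.type_of(7), "Ty#7");
    assert_eq!(tcx.type_of(7), "Ty#7"); // second call hits the cache
    assert_eq!(tcx.provider_calls, 1);
}
```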
N.B. Most of the `rustc_*` crates only provide **local N.B. Most of the `rustc_*` crates only provide **local
@ -218,7 +218,7 @@ something like:
[maps-mod]: https://doc.rust-lang.org/nightly/nightly-rustc/rustc/ty/maps/index.html [maps-mod]: https://doc.rust-lang.org/nightly/nightly-rustc/rustc/ty/maps/index.html
``` ```rust,ignore
define_maps! { <'tcx> define_maps! { <'tcx>
/// Records the type of every item. /// Records the type of every item.
[] fn type_of: TypeOfItem(DefId) -> Ty<'tcx>, [] fn type_of: TypeOfItem(DefId) -> Ty<'tcx>,
@ -229,7 +229,7 @@ define_maps! { <'tcx>
Each line of the macro defines one query. The name is broken up like this: Each line of the macro defines one query. The name is broken up like this:
``` ```rust,ignore
[] fn type_of: TypeOfItem(DefId) -> Ty<'tcx>, [] fn type_of: TypeOfItem(DefId) -> Ty<'tcx>,
^^ ^^^^^^^ ^^^^^^^^^^ ^^^^^ ^^^^^^^^ ^^ ^^^^^^^ ^^^^^^^^^^ ^^^^^ ^^^^^^^^
| | | | | | | | | |
@ -288,7 +288,7 @@ describing the query. Each such struct implements the
key/value of that particular query. Basically the code generated looks something key/value of that particular query. Basically the code generated looks something
like this: like this:
```rust ```rust,ignore
// Dummy struct representing a particular kind of query: // Dummy struct representing a particular kind of query:
pub struct type_of<'tcx> { phantom: PhantomData<&'tcx ()> } pub struct type_of<'tcx> { phantom: PhantomData<&'tcx ()> }
@ -306,7 +306,7 @@ this trait is optional if the query key is `DefId`, but if you *don't*
implement it, you get a pretty generic error ("processing `foo`..."). implement it, you get a pretty generic error ("processing `foo`...").
You can put new impls into the `config` module. They look something like this: You can put new impls into the `config` module. They look something like this:
```rust ```rust,ignore
impl<'tcx> QueryDescription for queries::type_of<'tcx> { impl<'tcx> QueryDescription for queries::type_of<'tcx> {
fn describe(tcx: TyCtxt, key: DefId) -> String { fn describe(tcx: TyCtxt, key: DefId) -> String {
format!("computing the type of `{}`", tcx.item_path_str(key)) format!("computing the type of `{}`", tcx.item_path_str(key))


@ -70,7 +70,7 @@ The main process of crate crawling is done in `clean/mod.rs` through several
implementations of the `Clean` trait defined within. This is a conversion implementations of the `Clean` trait defined within. This is a conversion
trait, which defines one method: trait, which defines one method:
```rust ```rust,ignore
pub trait Clean<T> { pub trait Clean<T> {
fn clean(&self, cx: &DocContext) -> T; fn clean(&self, cx: &DocContext) -> T;
} }


@ -100,7 +100,7 @@ are normally put after the short comment that explains the point of
this test. For example, this test uses the `// compile-flags` command this test. For example, this test uses the `// compile-flags` command
to specify a custom flag to give to rustc when the test is compiled: to specify a custom flag to give to rustc when the test is compiled:
```rust ```rust,ignore
// Copyright 2017 The Rust Project Developers. blah blah blah. // Copyright 2017 The Rust Project Developers. blah blah blah.
// ... // ...
// except according to those terms. // except according to those terms.
@ -198,7 +198,7 @@ incremental, though incremental tests are somewhat
different). Revisions allow a single test file to be used for multiple different). Revisions allow a single test file to be used for multiple
tests. This is done by adding a special header at the top of the file: tests. This is done by adding a special header at the top of the file:
``` ```rust
// revisions: foo bar baz // revisions: foo bar baz
``` ```
@ -211,7 +211,7 @@ You can also customize headers and expected error messages to a particular
revision. To do this, add `[foo]` (or `bar`, `baz`, etc) after the `//` revision. To do this, add `[foo]` (or `bar`, `baz`, etc) after the `//`
comment, like so: comment, like so:
``` ```rust
// A flag to pass in only for cfg `foo`: // A flag to pass in only for cfg `foo`:
//[foo]compile-flags: -Z verbose //[foo]compile-flags: -Z verbose
@ -284,7 +284,7 @@ between platforms, mainly about filenames:
Sometimes these built-in normalizations are not enough. In such cases, you Sometimes these built-in normalizations are not enough. In such cases, you
may provide custom normalization rules using the header commands, e.g. may provide custom normalization rules using the header commands, e.g.
``` ```rust
// normalize-stdout-test: "foo" -> "bar" // normalize-stdout-test: "foo" -> "bar"
// normalize-stderr-32bit: "fn\(\) \(32 bits\)" -> "fn\(\) \($$PTR bits\)" // normalize-stderr-32bit: "fn\(\) \(32 bits\)" -> "fn\(\) \($$PTR bits\)"
// normalize-stderr-64bit: "fn\(\) \(64 bits\)" -> "fn\(\) \($$PTR bits\)" // normalize-stderr-64bit: "fn\(\) \(64 bits\)" -> "fn\(\) \($$PTR bits\)"
@ -298,7 +298,7 @@ default regex flavor provided by `regex` crate.
The corresponding reference file will use the normalized output to test both The corresponding reference file will use the normalized output to test both
32-bit and 64-bit platforms: 32-bit and 64-bit platforms:
``` ```txt
... ...
| |
= note: source type: fn() ($PTR bits) = note: source type: fn() ($PTR bits)


@ -3,8 +3,8 @@
You can run the tests using `x.py`. The most basic command -- which You can run the tests using `x.py`. The most basic command -- which
you will almost never want to use! -- is as follows: you will almost never want to use! -- is as follows:
``` ```bash
./x.py test > ./x.py test
``` ```
This will build the full stage 2 compiler and then run the whole test This will build the full stage 2 compiler and then run the whole test
@ -17,7 +17,7 @@ The test results are cached and previously successful tests are
`ignored` during testing. The stdout/stderr contents as well as a `ignored` during testing. The stdout/stderr contents as well as a
timestamp file for every test can be found under `build/ARCH/test/`. timestamp file for every test can be found under `build/ARCH/test/`.
To force-rerun a test (e.g. in case the test runner fails to notice To force-rerun a test (e.g. in case the test runner fails to notice
a change) you can simply remove the timestamp file. a change) you can simply remove the timestamp file.
## Running a subset of the test suites ## Running a subset of the test suites
@ -27,7 +27,7 @@ test" that can be used after modifying rustc to see if things are
generally working correctly would be the following: generally working correctly would be the following:
```bash ```bash
./x.py test --stage 1 src/test/{ui,compile-fail,run-pass} > ./x.py test --stage 1 src/test/{ui,compile-fail,run-pass}
``` ```
This will run the `ui`, `compile-fail`, and `run-pass` test suites, This will run the `ui`, `compile-fail`, and `run-pass` test suites,
@ -37,7 +37,7 @@ example, if you are hacking on debuginfo, you may be better off with
the debuginfo test suite: the debuginfo test suite:
```bash ```bash
./x.py test --stage 1 src/test/debuginfo > ./x.py test --stage 1 src/test/debuginfo
``` ```
**Warning:** Note that bors only runs the tests with the full stage 2 **Warning:** Note that bors only runs the tests with the full stage 2
@ -51,8 +51,8 @@ Another common thing that people want to do is to run an **individual
test**, often the test they are trying to fix. One way to do this is test**, often the test they are trying to fix. One way to do this is
to invoke `x.py` with the `--test-args` option: to invoke `x.py` with the `--test-args` option:
``` ```bash
./x.py test --stage 1 src/test/ui --test-args issue-1234 > ./x.py test --stage 1 src/test/ui --test-args issue-1234
``` ```
Under the hood, the test runner invokes the standard rust test runner Under the hood, the test runner invokes the standard rust test runner
@ -62,8 +62,8 @@ filtering for tests that include "issue-1234" in the name.
Often, though, it's easier to just run the test by hand. Most tests are Often, though, it's easier to just run the test by hand. Most tests are
just `.rs` files, so you can do something like just `.rs` files, so you can do something like
``` ```bash
rustc +stage1 src/test/ui/issue-1234.rs > rustc +stage1 src/test/ui/issue-1234.rs
``` ```
This is much faster, but doesn't always work. For example, some tests This is much faster, but doesn't always work. For example, some tests


@ -26,7 +26,7 @@ possible impl is this one, with def-id 22:
[selection process]: ./trait-resolution.html#selection [selection process]: ./trait-resolution.html#selection
```rust ```rust,ignore
impl Foo<isize> for usize { ... } // Impl #22 impl Foo<isize> for usize { ... } // Impl #22
``` ```


@ -18,14 +18,14 @@ trait Foo<X> {
Let's say we have a function `want_hrtb` that wants a type which Let's say we have a function `want_hrtb` that wants a type which
implements `Foo<&'a isize>` for any `'a`: implements `Foo<&'a isize>` for any `'a`:
```rust ```rust,ignore
fn want_hrtb<T>() where T : for<'a> Foo<&'a isize> { ... } fn want_hrtb<T>() where T : for<'a> Foo<&'a isize> { ... }
``` ```
Now we have a struct `AnyInt` that implements `Foo<&'a isize>` for any Now we have a struct `AnyInt` that implements `Foo<&'a isize>` for any
`'a`: `'a`:
```rust ```rust,ignore
struct AnyInt; struct AnyInt;
impl<'a> Foo<&'a isize> for AnyInt { } impl<'a> Foo<&'a isize> for AnyInt { }
``` ```
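Filled out with bodies (the method signature and the `usize` return type are my own filler, since the original elides them), this example compiles and runs: because `AnyInt` implements `Foo<&'a isize>` for every `'a`, it satisfies the higher-ranked bound on `want_hrtb`.

```rust
trait Foo<X> {
    fn foo(&self, x: X) -> usize;
}

struct AnyInt;

// `AnyInt` implements `Foo<&'a isize>` for *any* lifetime `'a`.
impl<'a> Foo<&'a isize> for AnyInt {
    fn foo(&self, x: &'a isize) -> usize {
        *x as usize
    }
}

// `want_hrtb` demands an implementation for every possible `'a`.
fn want_hrtb<T>(t: T, v: &isize) -> usize
where
    T: for<'a> Foo<&'a isize>,
{
    t.foo(v)
}

fn main() {
    assert_eq!(want_hrtb(AnyInt, &7), 7);
}
```

A type like `StaticInt` from the failure case below, which only implements `Foo<&'static isize>`, would be rejected at the call to `want_hrtb`.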
@ -71,7 +71,7 @@ set for `'0` is `{'0, '$a}`, and hence the check will succeed.
Let's consider a failure case. Imagine we also have a struct Let's consider a failure case. Imagine we also have a struct
```rust ```rust,ignore
struct StaticInt; struct StaticInt;
impl Foo<&'static isize> for StaticInt; impl Foo<&'static isize> for StaticInt;
``` ```


@ -13,13 +13,13 @@ see [*this* traits chapter](./traits.html).
Trait resolution is the process of pairing up an impl with each Trait resolution is the process of pairing up an impl with each
reference to a trait. So, for example, if there is a generic function like: reference to a trait. So, for example, if there is a generic function like:
```rust ```rust,ignore
fn clone_slice<T:Clone>(x: &[T]) -> Vec<T> { /*...*/ } fn clone_slice<T:Clone>(x: &[T]) -> Vec<T> { ... }
``` ```
and then a call to that function: and then a call to that function:
```rust ```rust,ignore
let v: Vec<isize> = clone_slice(&[1, 2, 3]) let v: Vec<isize> = clone_slice(&[1, 2, 3])
``` ```
@ -30,7 +30,7 @@ Note that in some cases, like generic functions, we may not be able to
find a specific impl, but we can figure out that the caller must find a specific impl, but we can figure out that the caller must
provide an impl. For example, consider the body of `clone_slice`: provide an impl. For example, consider the body of `clone_slice`:
```rust ```rust,ignore
fn clone_slice<T:Clone>(x: &[T]) -> Vec<T> { fn clone_slice<T:Clone>(x: &[T]) -> Vec<T> {
let mut v = Vec::new(); let mut v = Vec::new();
for e in &x { for e in &x {
@ -143,7 +143,7 @@ otherwise the result is considered ambiguous.
This process is easier if we work through some examples. Consider This process is easier if we work through some examples. Consider
the following trait: the following trait:
```rust ```rust,ignore
trait Convert<Target> { trait Convert<Target> {
fn convert(&self) -> Target; fn convert(&self) -> Target;
} }
@ -154,14 +154,14 @@ converts from the (implicit) `Self` type to the `Target` type. If we
wanted to permit conversion between `isize` and `usize`, we might wanted to permit conversion between `isize` and `usize`, we might
implement `Convert` like so: implement `Convert` like so:
```rust ```rust,ignore
impl Convert<usize> for isize { /*...*/ } // isize -> usize impl Convert<usize> for isize { ... } // isize -> usize
impl Convert<isize> for usize { /*...*/ } // usize -> isize impl Convert<isize> for usize { ... } // usize -> isize
``` ```
Now imagine there is some code like the following: Now imagine there is some code like the following:
```rust ```rust,ignore
let x: isize = ...; let x: isize = ...;
let y = x.convert(); let y = x.convert();
``` ```
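Here is a runnable version of that code with the impl bodies filled in (the casts are illustrative stand-ins for the elided `...`): matching `Self = isize` against the two impls leaves only the first one, which forces `Target = usize` during selection.

```rust
trait Convert<Target> {
    fn convert(&self) -> Target;
}

impl Convert<usize> for isize {
    fn convert(&self) -> usize { *self as usize } // isize -> usize
}

impl Convert<isize> for usize {
    fn convert(&self) -> isize { *self as isize } // usize -> isize
}

fn main() {
    let x: isize = 5;
    // Selection pairs this call with the first impl: its `Self` type
    // unifies with `isize`, and that fixes `Target = usize`.
    let y: usize = x.convert();
    assert_eq!(y, 5);
}
```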
@ -186,7 +186,7 @@ inference?
But what happens if there are multiple impls where all the types But what happens if there are multiple impls where all the types
unify? Consider this example: unify? Consider this example:
```rust ```rust,ignore
trait Get { trait Get {
fn get(&self) -> Self; fn get(&self) -> Self;
} }
@ -224,11 +224,11 @@ the same trait (or some subtrait) and which can match against the obligation.
Consider this simple example: Consider this simple example:
```rust ```rust,ignore
trait A1 { trait A1 {
fn do_a1(&self); fn do_a1(&self);
} }
trait A2 : A1 { /*...*/ } trait A2 : A1 { ... }
trait B { trait B {
fn do_b(&self); fn do_b(&self);
@ -256,13 +256,13 @@ values found in the obligation, possibly yielding a type error.
Suppose we have the following variation of the `Convert` example in the Suppose we have the following variation of the `Convert` example in the
previous section: previous section:
```rust ```rust,ignore
trait Convert<Target> { trait Convert<Target> {
fn convert(&self) -> Target; fn convert(&self) -> Target;
} }
impl Convert<usize> for isize { /*...*/ } // isize -> usize impl Convert<usize> for isize { ... } // isize -> usize
impl Convert<isize> for usize { /*...*/ } // usize -> isize impl Convert<isize> for usize { ... } // usize -> isize
let x: isize = ...; let x: isize = ...;
let y: char = x.convert(); // NOTE: `y: char` now! let y: char = x.convert(); // NOTE: `y: char` now!
@ -296,11 +296,11 @@ everything out.
Here is an example: Here is an example:
```rust ```rust,ignore
trait Foo { /*...*/ } trait Foo { ... }
impl<U,T:Bar<U>> Foo for Vec<T> { /*...*/ } impl<U, T:Bar<U>> Foo for Vec<T> { ... }
impl Bar<usize> for isize { /*...*/ } impl Bar<usize> for isize { ... }
``` ```
After one shallow round of selection for an obligation like `Vec<isize> After one shallow round of selection for an obligation like `Vec<isize>


@ -29,10 +29,10 @@ that is, simplified -- based on the types given in an impl. So, to
continue with our example, the impl of `IntoIterator` for `Option<T>` continue with our example, the impl of `IntoIterator` for `Option<T>`
declares (among other things) that `Item = T`: declares (among other things) that `Item = T`:
```rust ```rust,ignore
impl<T> IntoIterator for Option<T> { impl<T> IntoIterator for Option<T> {
type Item = T; type Item = T;
.. ...
} }
``` ```
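This normalization is observable in ordinary code. Inside the generic function below, `I::Item` cannot be normalized; at the call site, where `I` is known to be `Option<i32>`, the compiler normalizes `<Option<i32> as IntoIterator>::Item` to `i32` using the standard library's real `impl<T> IntoIterator for Option<T>`.

```rust
// Inside a generic fn, `I::Item` is an unnormalized projection: the
// signature works for any `I` without knowing what `Item` will be.
fn collect_all<I: IntoIterator>(iter: I) -> Vec<I::Item> {
    let mut out = Vec::new();
    for item in iter {
        out.push(item);
    }
    out
}

fn main() {
    // At this call site `I = Option<i32>`, so `I::Item` normalizes to `i32`.
    let v: Vec<i32> = collect_all(Some(42));
    assert_eq!(v, vec![42]);
}
```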
@ -51,9 +51,11 @@ In our logic, normalization is defined by a predicate
impls. For example, the `impl` of `IntoIterator` for `Option<T>` that impls. For example, the `impl` of `IntoIterator` for `Option<T>` that
we saw above would be lowered to a program clause like so: we saw above would be lowered to a program clause like so:
forall<T> { ```txt
Normalize(<Option<T> as IntoIterator>::Item -> T) forall<T> {
} Normalize(<Option<T> as IntoIterator>::Item -> T)
}
```
(An aside: since we do not permit quantification over traits, this is (An aside: since we do not permit quantification over traits, this is
really more like a family of predicates, one for each associated really more like a family of predicates, one for each associated
@ -67,7 +69,7 @@ we've seen so far.
Sometimes however we want to work with associated types that cannot be Sometimes however we want to work with associated types that cannot be
normalized. For example, consider this function: normalized. For example, consider this function:
```rust ```rust,ignore
fn foo<T: IntoIterator>(...) { ... } fn foo<T: IntoIterator>(...) { ... }
``` ```
@ -99,20 +101,24 @@ consider an associated type projection equal to another type?":
We now introduce the `ProjectionEq` predicate to bring those two cases We now introduce the `ProjectionEq` predicate to bring those two cases
together. The `ProjectionEq` predicate looks like so: together. The `ProjectionEq` predicate looks like so:
ProjectionEq(<T as IntoIterator>::Item = U) ```txt
ProjectionEq(<T as IntoIterator>::Item = U)
```
and we will see that it can be proven *either* via normalization or and we will see that it can be proven *either* via normalization or
skolemization. As part of lowering an associated type declaration from skolemization. As part of lowering an associated type declaration from
some trait, we create two program clauses for `ProjectionEq`: some trait, we create two program clauses for `ProjectionEq`:
forall<T, U> { ```txt
ProjectionEq(<T as IntoIterator>::Item = U) :- forall<T, U> {
Normalize(<T as IntoIterator>::Item -> U) ProjectionEq(<T as IntoIterator>::Item = U) :-
} Normalize(<T as IntoIterator>::Item -> U)
}
forall<T> { forall<T> {
ProjectionEq(<T as IntoIterator>::Item = (IntoIterator::Item)<T>) ProjectionEq(<T as IntoIterator>::Item = (IntoIterator::Item)<T>)
} }
```
These are the only two `ProjectionEq` program clauses we ever make for These are the only two `ProjectionEq` program clauses we ever make for
any given associated item. any given associated item.
@ -124,7 +130,9 @@ with unification. As described in the
[type inference](./type-inference.html) section, unification is [type inference](./type-inference.html) section, unification is
basically a procedure with a signature like this: basically a procedure with a signature like this:
Unify(A, B) = Result<(Subgoals, RegionConstraints), NoSolution> ```txt
Unify(A, B) = Result<(Subgoals, RegionConstraints), NoSolution>
```
In other words, we try to unify two things A and B. That procedure In other words, we try to unify two things A and B. That procedure
might just fail, in which case we get back `Err(NoSolution)`. This might just fail, in which case we get back `Err(NoSolution)`. This


@ -19,12 +19,16 @@ In a traditional Prolog system, when you start a query, the solver
will run off and start supplying you with every possible answer it can will run off and start supplying you with every possible answer it can
find. So given something like this: find. So given something like this:
?- Vec<i32>: AsRef<?U> ```txt
?- Vec<i32>: AsRef<?U>
```
The solver might answer: The solver might answer:
Vec<i32>: AsRef<[i32]> ```txt
continue? (y/n) Vec<i32>: AsRef<[i32]>
continue? (y/n)
```
This `continue` bit is interesting. The idea in Prolog is that the This `continue` bit is interesting. The idea in Prolog is that the
solver is finding **all possible** instantiations of your query that solver is finding **all possible** instantiations of your query that
@ -35,34 +39,42 @@ response with our original query -- Rust's solver gives back a
substitution instead). If we were to hit `y`, the solver might then substitution instead). If we were to hit `y`, the solver might then
give us another possible answer: give us another possible answer:
Vec<i32>: AsRef<Vec<i32>> ```txt
continue? (y/n) Vec<i32>: AsRef<Vec<i32>>
continue? (y/n)
```
This answer derives from the fact that there is a reflexive impl This answer derives from the fact that there is a reflexive impl
(`impl<T> AsRef<T> for T`) for `AsRef`. If we were to hit `y` again, (`impl<T> AsRef<T> for T`) for `AsRef`. If we were to hit `y` again,
then we might get back a negative response: then we might get back a negative response:
no ```txt
no
```
Naturally, in some cases, there may be no possible answers, and hence Naturally, in some cases, there may be no possible answers, and hence
the solver will just give me back `no` right away: the solver will just give me back `no` right away:
?- Box<i32>: Copy ```txt
no ?- Box<i32>: Copy
no
```
In some cases, there might be an infinite number of responses. So for In some cases, there might be an infinite number of responses. So for
example if I gave this query, and I kept hitting `y`, then the solver example if I gave this query, and I kept hitting `y`, then the solver
would never stop giving me back answers: would never stop giving me back answers:
?- Vec<?U>: Clone ```txt
Vec<i32>: Clone ?- Vec<?U>: Clone
continue? (y/n) Vec<i32>: Clone
Vec<Box<i32>>: Clone continue? (y/n)
continue? (y/n) Vec<Box<i32>>: Clone
Vec<Box<Box<i32>>>: Clone continue? (y/n)
continue? (y/n) Vec<Box<Box<i32>>>: Clone
Vec<Box<Box<Box<i32>>>>: Clone continue? (y/n)
continue? (y/n) Vec<Box<Box<Box<i32>>>>: Clone
continue? (y/n)
```
As you can imagine, the solver will gleefully keep adding another As you can imagine, the solver will gleefully keep adding another
layer of `Box` until we ask it to stop, or it runs out of memory. layer of `Box` until we ask it to stop, or it runs out of memory.
@ -70,12 +82,16 @@ layer of `Box` until we ask it to stop, or it runs out of memory.
Another interesting thing is that queries might still have variables Another interesting thing is that queries might still have variables
in them. For example: in them. For example:
?- Rc<?T>: Clone ```txt
?- Rc<?T>: Clone
```
might produce the answer: might produce the answer:
Rc<?T>: Clone ```txt
continue? (y/n) Rc<?T>: Clone
continue? (y/n)
```
After all, `Rc<?T>` is true **no matter what type `?T` is**. After all, `Rc<?T>` is true **no matter what type `?T` is**.
@ -132,7 +148,7 @@ impls; among them, there are these two (for clarity, I've written the
[borrow]: https://doc.rust-lang.org/std/borrow/trait.Borrow.html [borrow]: https://doc.rust-lang.org/std/borrow/trait.Borrow.html
```rust ```rust,ignore
impl<T> Borrow<T> for T where T: ?Sized impl<T> Borrow<T> for T where T: ?Sized
impl<T> Borrow<[T]> for Vec<T> where T: Sized impl<T> Borrow<[T]> for Vec<T> where T: Sized
``` ```
@ -140,7 +156,7 @@ impl<T> Borrow<[T]> for Vec<T> where T: Sized
**Example 1.** Imagine we are type-checking this (rather artificial) **Example 1.** Imagine we are type-checking this (rather artificial)
bit of code: bit of code:
```rust ```rust,ignore
fn foo<A, B>(a: A, vec_b: Option<B>) where A: Borrow<B> { } fn foo<A, B>(a: A, vec_b: Option<B>) where A: Borrow<B> { }
fn main() { fn main() {
@ -185,7 +201,7 @@ other sources, in which case we can try the trait query again.
**Example 2.** We can now extend our previous example a bit, **Example 2.** We can now extend our previous example a bit,
and assign a value to `u`: and assign a value to `u`:
```rust ```rust,ignore
fn foo<A, B>(a: A, vec_b: Option<B>) where A: Borrow<B> { } fn foo<A, B>(a: A, vec_b: Option<B>) where A: Borrow<B> { }
fn main() { fn main() {
@ -210,11 +226,15 @@ Let's suppose that the type checker decides to revisit the
Borrow<?U>`. `?U` is no longer an unbound inference variable; it now Borrow<?U>`. `?U` is no longer an unbound inference variable; it now
has a value, `Vec<?V>`. So, if we "refresh" the query with that value, we get: has a value, `Vec<?V>`. So, if we "refresh" the query with that value, we get:
Vec<?T>: Borrow<Vec<?V>> ```txt
Vec<?T>: Borrow<Vec<?V>>
```
This time, there is only one impl that applies, the reflexive impl: This time, there is only one impl that applies, the reflexive impl:
impl<T> Borrow<T> for T where T: ?Sized ```txt
impl<T> Borrow<T> for T where T: ?Sized
```
Therefore, the trait checker will answer: Therefore, the trait checker will answer:

View File

@ -42,14 +42,18 @@ This query contains two unbound variables, but it also contains the
lifetime `'static`. The trait system generally ignores all lifetimes lifetime `'static`. The trait system generally ignores all lifetimes
and treats them equally, so when canonicalizing, we will *also* and treats them equally, so when canonicalizing, we will *also*
replace any [free lifetime](./appendix-background.html#free-vs-bound) with a replace any [free lifetime](./appendix-background.html#free-vs-bound) with a
canonical variable. Therefore, we get the following result: canonical variable. Therefore, we get the following result:
?0: Foo<'?1, ?2> ```txt
?0: Foo<'?1, ?2>
Sometimes we write this differently, like so: ```
Sometimes we write this differently, like so:
```txt
for<T,L,T> { ?0: Foo<'?1, ?2> }
```
for<T,L,T> { ?0: Foo<'?1, ?2> }
This `for<>` gives some information about each of the canonical This `for<>` gives some information about each of the canonical
variables within. In this case, each `T` indicates a type variable, variables within. In this case, each `T` indicates a type variable,
so `?0` and `?2` are types; the `L` indicates a lifetime variable, so so `?0` and `?2` are types; the `L` indicates a lifetime variable, so
@ -57,8 +61,10 @@ so `?0` and `?2` are types; the `L` indicates a lifetime variable, so
`CanonicalVarValues` array OV with the "original values" for each `CanonicalVarValues` array OV with the "original values" for each
canonicalized variable: canonicalized variable:
[?A, 'static, ?B] ```txt
[?A, 'static, ?B]
```
We'll need this vector OV later, when we process the query response. We'll need this vector OV later, when we process the query response.
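As a rough illustration of what canonicalization does, here is a toy Rust sketch (the `canonicalize` helper and its string-based representation are invented for this example, not rustc's actual implementation): it numbers each distinct variable and records the original values, just like the vector OV above:

```rust
// Toy sketch: replace each distinct inference variable or free lifetime
// in a query with a numbered canonical variable, and remember the
// original values so the response can be mapped back later.
fn canonicalize(query: &str, vars: &[&str]) -> (String, Vec<String>) {
    let mut canonical = query.to_string();
    let mut originals = Vec::new();
    for (i, v) in vars.iter().enumerate() {
        // Lifetimes keep their leading tick: 'static becomes '?1, etc.
        let repl = if v.starts_with('\'') {
            format!("'?{}", i)
        } else {
            format!("?{}", i)
        };
        canonical = canonical.replace(*v, &repl);
        originals.push(v.to_string());
    }
    (canonical, originals)
}

fn main() {
    let (canonical, ov) = canonicalize("?A: Foo<'static, ?B>", &["?A", "'static", "?B"]);
    assert_eq!(canonical, "?0: Foo<'?1, ?2>");
    // `ov` is the "original values" vector OV from the text.
    assert_eq!(ov, ["?A", "'static", "?B"]);
}
```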
## Executing the query ## Executing the query
@ -70,18 +76,24 @@ we create a substitution S from the canonical form containing a fresh
inference variable (of suitable kind) for each canonical variable. inference variable (of suitable kind) for each canonical variable.
So, for our example query: So, for our example query:
for<T,L,T> { ?0: Foo<'?1, ?2> } ```txt
for<T,L,T> { ?0: Foo<'?1, ?2> }
```
the substitution S might be: the substitution S might be:
S = [?A, '?B, ?C] ```txt
S = [?A, '?B, ?C]
```
We can then replace the bound canonical variables (`?0`, etc) with We can then replace the bound canonical variables (`?0`, etc) with
these inference variables, yielding the following fully instantiated these inference variables, yielding the following fully instantiated
query: query:
?A: Foo<'?B, ?C> ```txt
?A: Foo<'?B, ?C>
```
Remember that substitution S though! We're going to need it later. Remember that substitution S though! We're going to need it later.
OK, now that we have a fresh inference context and an instantiated OK, now that we have a fresh inference context and an instantiated
@ -93,7 +105,7 @@ created. For example, if there were only one impl of `Foo`, like so:
[cqqr]: ./traits-canonical-queries.html#query-response [cqqr]: ./traits-canonical-queries.html#query-response
``` ```rust,ignore
impl<'a, X> Foo<'a, X> for Vec<X> impl<'a, X> Foo<'a, X> for Vec<X>
where X: 'a where X: 'a
{ ... } { ... }
@ -123,39 +135,49 @@ result substitution `var_values`, and some region constraints. To
create this, we wind up re-using the substitution S that we created create this, we wind up re-using the substitution S that we created
when first instantiating our query. To refresh your memory, we had a query when first instantiating our query. To refresh your memory, we had a query
for<T,L,T> { ?0: Foo<'?1, ?2> } ```txt
for<T,L,T> { ?0: Foo<'?1, ?2> }
```
for which we made a substitution S: for which we made a substitution S:
S = [?A, '?B, ?C] ```txt
S = [?A, '?B, ?C]
```
We then did some work which unified some of those variables with other things. We then did some work which unified some of those variables with other things.
If we "refresh" S with the latest results, we get: If we "refresh" S with the latest results, we get:
S = [Vec<?E>, '?D, ?E] ```txt
S = [Vec<?E>, '?D, ?E]
```
These are precisely the new values for the three input variables from These are precisely the new values for the three input variables from
our original query. Note though that they include some new variables our original query. Note though that they include some new variables
(like `?E`). We can make those go away by canonicalizing again! We don't (like `?E`). We can make those go away by canonicalizing again! We don't
just canonicalize S, though, we canonicalize the whole query response QR: just canonicalize S, though, we canonicalize the whole query response QR:
QR = { ```txt
certainty: Proven, // or whatever QR = {
var_values: [Vec<?E>, '?D, ?E] // this is S certainty: Proven, // or whatever
region_constraints: [?E: '?D], // from the impl var_values: [Vec<?E>, '?D, ?E] // this is S
value: (), // for our purposes, just (), but region_constraints: [?E: '?D], // from the impl
// in some cases this might have value: (), // for our purposes, just (), but
// a type or other info // in some cases this might have
} // a type or other info
}
```
The result would be as follows: The result would be as follows:
Canonical(QR) = for<T, L> { ```txt
certainty: Proven, Canonical(QR) = for<T, L> {
var_values: [Vec<?0>, '?1, ?2] certainty: Proven,
region_constraints: [?2: '?1], var_values: [Vec<?0>, '?1, ?2]
value: (), region_constraints: [?2: '?1],
} value: (),
}
```
(One subtle point: when we canonicalize the query **result**, we do not (One subtle point: when we canonicalize the query **result**, we do not
use any special treatment for free lifetimes. Note that both use any special treatment for free lifetimes. Note that both
@ -172,20 +194,26 @@ In the previous section we produced a canonical query result. We now have
to apply that result in our original context. If you recall, way back in the to apply that result in our original context. If you recall, way back in the
beginning, we were trying to prove this query: beginning, we were trying to prove this query:
?A: Foo<'static, ?B> ```txt
?A: Foo<'static, ?B>
```
We canonicalized that into this: We canonicalized that into this:
for<T,L,T> { ?0: Foo<'?1, ?2> } ```txt
for<T,L,T> { ?0: Foo<'?1, ?2> }
```
and now we got back a canonical response: and now we got back a canonical response:
for<T, L> { ```txt
certainty: Proven, for<T, L> {
var_values: [Vec<?0>, '?1, ?2] certainty: Proven,
region_constraints: [?2: '?1], var_values: [Vec<?0>, '?1, ?2]
value: (), region_constraints: [?2: '?1],
} value: (),
}
```
We now want to apply that response to our context. Conceptually, how We now want to apply that response to our context. Conceptually, how
we do that is to (a) instantiate each of the canonical variables in we do that is to (a) instantiate each of the canonical variables in
@ -193,19 +221,19 @@ the result with a fresh inference variable, (b) unify the values in
the result with the original values, and then (c) record the region the result with the original values, and then (c) record the region
constraints for later. Doing step (a) would yield a result of constraints for later. Doing step (a) would yield a result of
``` ```txt
{ {
certainty: Proven, certainty: Proven,
var_values: [Vec<?C>, '?D, ?C] var_values: [Vec<?C>, '?D, ?C]
^^ ^^^ fresh inference variables ^^ ^^^ fresh inference variables
region_constraints: [?C: '?D], region_constraints: [?C: '?D],
value: (), value: (),
} }
``` ```
Step (b) would then unify: Step (b) would then unify:
``` ```txt
?A with Vec<?C> ?A with Vec<?C>
'static with '?D 'static with '?D
?B with ?C ?B with ?C

View File

@ -12,22 +12,24 @@ a few new superpowers.
In Rust's solver, **goals** and **clauses** have the following forms In Rust's solver, **goals** and **clauses** have the following forms
(note that the two definitions reference one another): (note that the two definitions reference one another):
Goal = DomainGoal // defined in the section below ```txt
| Goal && Goal Goal = DomainGoal // defined in the section below
| Goal || Goal | Goal && Goal
| exists<K> { Goal } // existential quantification | Goal || Goal
| forall<K> { Goal } // universal quantification | exists<K> { Goal } // existential quantification
| if (Clause) { Goal } // implication | forall<K> { Goal } // universal quantification
| true // something that's trivially true | if (Clause) { Goal } // implication
| ambiguous // something that's never provable | true // something that's trivially true
| ambiguous // something that's never provable
Clause = DomainGoal Clause = DomainGoal
| Clause :- Goal // if can prove Goal, then Clause is true | Clause :- Goal // if can prove Goal, then Clause is true
| Clause && Clause | Clause && Clause
| forall<K> { Clause } | forall<K> { Clause }
K = <type> // a "kind" K = <type> // a "kind"
| <lifetime> | <lifetime>
```
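A minimal sketch of this grammar and its depth-first proof procedure, under the drastic simplifying assumption that there are no variables or quantifiers (all names here are invented for illustration; the real solver is far richer):

```rust
// Toy sketch of the goal/clause grammar with a naive depth-first prover.
enum Goal {
    Domain(&'static str), // an opaque domain goal, e.g. "Implemented(u32: Display)"
    And(Box<Goal>, Box<Goal>),
    Or(Box<Goal>, Box<Goal>),
    True,
}

// A clause `head :- body`; a plain fact is a clause whose body is `True`.
struct Clause {
    head: &'static str,
    body: Goal,
}

fn prove(goal: &Goal, clauses: &[Clause]) -> bool {
    match goal {
        Goal::True => true,
        Goal::And(a, b) => prove(a, clauses) && prove(b, clauses),
        Goal::Or(a, b) => prove(a, clauses) || prove(b, clauses),
        // A domain goal holds if some clause with a matching head has a
        // provable body (depth-first search over the clauses).
        Goal::Domain(d) => clauses
            .iter()
            .filter(|c| c.head == *d)
            .any(|c| prove(&c.body, clauses)),
    }
}

fn main() {
    let clauses = [
        Clause { head: "Clone(i32)", body: Goal::True },
        Clause { head: "Clone(Vec<i32>)", body: Goal::Domain("Clone(i32)") },
    ];
    assert!(prove(&Goal::Domain("Clone(Vec<i32>)"), &clauses));
    assert!(!prove(&Goal::Domain("Clone(Bar)"), &clauses));
}
```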
The proof procedure for these sorts of goals is actually quite The proof procedure for these sorts of goals is actually quite
straightforward. Essentially, it's a form of depth-first search. The straightforward. Essentially, it's a form of depth-first search. The
@ -47,8 +49,10 @@ To define the set of *domain goals* in our system, we need to first
introduce a few simple formulations. A **trait reference** consists of introduce a few simple formulations. A **trait reference** consists of
the name of a trait along with a suitable set of inputs P0..Pn: the name of a trait along with a suitable set of inputs P0..Pn:
TraitRef = P0: TraitName<P1..Pn> ```txt
TraitRef = P0: TraitName<P1..Pn>
```
So, for example, `u32: Display` is a trait reference, as is `Vec<T>: So, for example, `u32: Display` is a trait reference, as is `Vec<T>:
IntoIterator`. Note that Rust surface syntax also permits some extra IntoIterator`. Note that Rust surface syntax also permits some extra
things, like associated type bindings (`Vec<T>: IntoIterator<Item = things, like associated type bindings (`Vec<T>: IntoIterator<Item =
@ -59,20 +63,24 @@ T>`), that are not part of a trait reference.
A **projection** consists of an associated item reference along with A **projection** consists of an associated item reference along with
its inputs P0..Pm: its inputs P0..Pm:
Projection = <P0 as TraitName<P1..Pn>>::AssocItem<Pn+1..Pm> ```txt
Projection = <P0 as TraitName<P1..Pn>>::AssocItem<Pn+1..Pm>
```
Given that, we can define a `DomainGoal` as follows: Given that, we can define a `DomainGoal` as follows:
DomainGoal = Implemented(TraitRef) ```txt
| ProjectionEq(Projection = Type) DomainGoal = Implemented(TraitRef)
| Normalize(Projection -> Type) | ProjectionEq(Projection = Type)
| FromEnv(TraitRef) | Normalize(Projection -> Type)
| FromEnv(Projection = Type) | FromEnv(TraitRef)
| WellFormed(Type) | FromEnv(Projection = Type)
| WellFormed(TraitRef) | WellFormed(Type)
| WellFormed(Projection = Type) | WellFormed(TraitRef)
| Outlives(Type, Region) | WellFormed(Projection = Type)
| Outlives(Region, Region) | Outlives(Type, Region)
| Outlives(Region, Region)
```
- `Implemented(TraitRef)` -- true if the given trait is - `Implemented(TraitRef)` -- true if the given trait is
implemented for the given input types and lifetimes implemented for the given input types and lifetimes
@ -104,9 +112,11 @@ Given that, we can define a `DomainGoal` as follows:
Most goals in our system are "inductive". In an inductive goal, Most goals in our system are "inductive". In an inductive goal,
circular reasoning is disallowed. Consider this example clause: circular reasoning is disallowed. Consider this example clause:
```txt
Implemented(Foo: Bar) :- Implemented(Foo: Bar) :-
Implemented(Foo: Bar). Implemented(Foo: Bar).
```
Considered inductively, this clause is useless: if we are trying to Considered inductively, this clause is useless: if we are trying to
prove `Implemented(Foo: Bar)`, we would then recursively have to prove prove `Implemented(Foo: Bar)`, we would then recursively have to prove
`Implemented(Foo: Bar)`, and that cycle would continue ad infinitum `Implemented(Foo: Bar)`, and that cycle would continue ad infinitum
@ -130,8 +140,10 @@ struct Foo {
The default rules for auto traits say that `Foo` is `Send` if the The default rules for auto traits say that `Foo` is `Send` if the
types of its fields are `Send`. Therefore, we would have a rule like types of its fields are `Send`. Therefore, we would have a rule like
Implemented(Foo: Send) :- ```txt
Implemented(Option<Box<Foo>>: Send). Implemented(Foo: Send) :-
Implemented(Option<Box<Foo>>: Send).
```
As you can probably imagine, proving that `Option<Box<Foo>>: Send` is As you can probably imagine, proving that `Option<Box<Foo>>: Send` is
going to wind up circularly requiring us to prove that `Foo: Send` going to wind up circularly requiring us to prove that `Foo: Send`
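A toy sketch of the difference (invented names, not rustc code): track the set of goals currently being proven, and on hitting a cycle either fail (inductive) or succeed (co-inductive):

```rust
use std::collections::HashSet;

// Toy sketch: prove a goal against rules of the form `head :- body...`,
// where a body may circle back to the goal itself. An inductive
// interpretation treats a cycle as failure; co-inductive as success.
fn prove(
    goal: &'static str,
    rules: &[(&'static str, Vec<&'static str>)],
    in_progress: &mut HashSet<&'static str>,
    coinductive: bool,
) -> bool {
    if in_progress.contains(goal) {
        // We hit a cycle: `goal` depends on itself.
        return coinductive;
    }
    in_progress.insert(goal);
    let ok = rules.iter().any(|(head, body)| {
        *head == goal && body.iter().all(|&g| prove(g, rules, in_progress, coinductive))
    });
    in_progress.remove(goal);
    ok
}

fn main() {
    // Foo: Send :- Option<Box<Foo>>: Send :- ... :- Foo: Send  (a cycle)
    let rules = vec![
        ("Foo: Send", vec!["Option<Box<Foo>>: Send"]),
        ("Option<Box<Foo>>: Send", vec!["Box<Foo>: Send"]),
        ("Box<Foo>: Send", vec!["Foo: Send"]),
    ];
    assert!(prove("Foo: Send", &rules, &mut HashSet::new(), true)); // co-inductive
    assert!(!prove("Foo: Send", &rules, &mut HashSet::new(), false)); // inductive
}
```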

View File

@ -49,7 +49,7 @@ standard [ui test] mechanisms to check them. In this case, there is a
need only be a prefix of the error), but [the stderr file] contains need only be a prefix of the error), but [the stderr file] contains
the full details: the full details:
``` ```txt
error: Implemented(T: Foo) :- ProjectionEq(<T as std::iter::Iterator>::Item == i32), TypeOutlives(T \ error: Implemented(T: Foo) :- ProjectionEq(<T as std::iter::Iterator>::Item == i32), TypeOutlives(T \
: 'static), Implemented(T: std::iter::Iterator), Implemented(T: std::marker::Sized). : 'static), Implemented(T: std::iter::Iterator), Implemented(T: std::marker::Sized).
--> $DIR/lower_impl.rs:15:1 --> $DIR/lower_impl.rs:15:1

View File

@ -67,7 +67,7 @@ but Chalk isn't modeling those right now.
Given a trait definition Given a trait definition
```rust ```rust,ignore
trait Trait<P1..Pn> // P0 == Self trait Trait<P1..Pn> // P0 == Self
where WC where WC
{ {
@ -87,10 +87,12 @@ relationships between different kinds of domain goals. The first such
rule from the trait header creates the mapping between the `FromEnv` rule from the trait header creates the mapping between the `FromEnv`
and `Implemented` predicates: and `Implemented` predicates:
// Rule Implemented-From-Env ```txt
forall<Self, P1..Pn> { // Rule Implemented-From-Env
Implemented(Self: Trait<P1..Pn>) :- FromEnv(Self: Trait<P1..Pn>) forall<Self, P1..Pn> {
} Implemented(Self: Trait<P1..Pn>) :- FromEnv(Self: Trait<P1..Pn>)
}
```
<a name="implied-bounds"> <a name="implied-bounds">
@ -101,17 +103,19 @@ The next few clauses have to do with implied bounds (see also
[RFC 2089]: https://rust-lang.github.io/rfcs/2089-implied-bounds.html [RFC 2089]: https://rust-lang.github.io/rfcs/2089-implied-bounds.html
// Rule Implied-Bound-From-Trait ```txt
// // Rule Implied-Bound-From-Trait
// For each where clause WC: //
forall<Self, P1..Pn> { // For each where clause WC:
FromEnv(WC) :- FromEnv(Self: Trait<P1..Pn>) forall<Self, P1..Pn> {
} FromEnv(WC) :- FromEnv(Self: Trait<P1..Pn>)
}
```
This clause says that if we are assuming that the trait holds, then we can also This clause says that if we are assuming that the trait holds, then we can also
assume that its where-clauses hold. It's perhaps useful to see an example: assume that its where-clauses hold. It's perhaps useful to see an example:
```rust ```rust,ignore
trait Eq: PartialEq { ... } trait Eq: PartialEq { ... }
``` ```
@ -145,7 +149,7 @@ all the where clauses that are transitively implied by `T: Trait`.
An example: An example:
```rust ```rust,ignore
trait Foo: A + Bar { } trait Foo: A + Bar { }
trait Bar: B + Foo { } trait Bar: B + Foo { }
trait A { } trait A { }
@ -180,7 +184,7 @@ items.
Given a trait that declares a (possibly generic) associated type: Given a trait that declares a (possibly generic) associated type:
```rust ```rust,ignore
trait Trait<P1..Pn> // P0 == Self trait Trait<P1..Pn> // P0 == Self
where WC where WC
{ {
@ -190,39 +194,43 @@ where WC
We will produce a number of program clauses. The first two define We will produce a number of program clauses. The first two define
the rules by which `ProjectionEq` can succeed; these two clauses are discussed the rules by which `ProjectionEq` can succeed; these two clauses are discussed
in detail in the [section on associated types](./traits-associated-types.html),, in detail in the [section on associated types](./traits-associated-types.html),
but reproduced here for reference: but reproduced here for reference:
// Rule ProjectionEq-Normalize ```txt
// // Rule ProjectionEq-Normalize
// ProjectionEq can succeed by normalizing: //
forall<Self, P1..Pn, Pn+1..Pm, U> { // ProjectionEq can succeed by normalizing:
ProjectionEq(<Self as Trait<P1..Pn>>::AssocType<Pn+1..Pm> = U) :- forall<Self, P1..Pn, Pn+1..Pm, U> {
Normalize(<Self as Trait<P1..Pn>>::AssocType<Pn+1..Pm> -> U) ProjectionEq(<Self as Trait<P1..Pn>>::AssocType<Pn+1..Pm> = U) :-
} Normalize(<Self as Trait<P1..Pn>>::AssocType<Pn+1..Pm> -> U)
}
// Rule ProjectionEq-Skolemize // Rule ProjectionEq-Skolemize
// //
// ProjectionEq can succeed by skolemizing, see "associated type" // ProjectionEq can succeed by skolemizing, see "associated type"
// chapter for more: // chapter for more:
forall<Self, P1..Pn, Pn+1..Pm> { forall<Self, P1..Pn, Pn+1..Pm> {
ProjectionEq( ProjectionEq(
<Self as Trait<P1..Pn>>::AssocType<Pn+1..Pm> = <Self as Trait<P1..Pn>>::AssocType<Pn+1..Pm> =
(Trait::AssocType)<Self, P1..Pn, Pn+1..Pm> (Trait::AssocType)<Self, P1..Pn, Pn+1..Pm>
) :- ) :-
// But only if the trait is implemented, and the conditions from // But only if the trait is implemented, and the conditions from
// the associated type are met as well: // the associated type are met as well:
Implemented(Self: Trait<P1..Pn>) Implemented(Self: Trait<P1..Pn>)
&& WC1 && WC1
} }
```
The next rule covers implied bounds for the projection. In particular, The next rule covers implied bounds for the projection. In particular,
the `Bounds` declared on the associated type must be proven to hold to the `Bounds` declared on the associated type must be proven to hold to
show that the impl is well-formed, and hence we can rely on them show that the impl is well-formed, and hence we can rely on them
elsewhere. elsewhere.
// XXX how exactly should we set this up? Have to be careful; ```txt
// presumably this has to be a kind of `FromEnv` setup. // XXX how exactly should we set this up? Have to be careful;
// presumably this has to be a kind of `FromEnv` setup.
```
### Lowering function and constant declarations ### Lowering function and constant declarations
@ -234,7 +242,7 @@ values below](#constant-vals) for more details.
Given an impl of a trait: Given an impl of a trait:
```rust ```rust,ignore
impl<P0..Pn> Trait<A1..An> for A0 impl<P0..Pn> Trait<A1..An> for A0
where WC where WC
{ {
@ -245,10 +253,12 @@ where WC
Let `TraitRef` be the trait reference `A0: Trait<A1..An>`. Then we Let `TraitRef` be the trait reference `A0: Trait<A1..An>`. Then we
will create the following rules: will create the following rules:
// Rule Implemented-From-Impl ```txt
forall<P0..Pn> { // Rule Implemented-From-Impl
Implemented(TraitRef) :- WC forall<P0..Pn> {
} Implemented(TraitRef) :- WC
}
```
In addition, we will lower all of the *impl items*. In addition, we will lower all of the *impl items*.
@ -258,7 +268,7 @@ In addition, we will lower all of the *impl items*.
Given an impl that contains: Given an impl that contains:
```rust ```rust,ignore
impl<P0..Pn> Trait<A1..An> for A0 impl<P0..Pn> Trait<A1..An> for A0
where WC where WC
{ {
@ -268,13 +278,15 @@ where WC
We produce the following rule: We produce the following rule:
// Rule Normalize-From-Impl ```txt
forall<P0..Pm> { // Rule Normalize-From-Impl
forall<Pn+1..Pm> { forall<P0..Pm> {
Normalize(<A0 as Trait<A1..An>>::AssocType<Pn+1..Pm> -> T) :- forall<Pn+1..Pm> {
WC && WC1 Normalize(<A0 as Trait<A1..An>>::AssocType<Pn+1..Pm> -> T) :-
} WC && WC1
} }
}
```
Note that `WC` and `WC1` both encode where-clauses that the impl can Note that `WC` and `WC1` both encode where-clauses that the impl can
rely on. rely on.

View File

@ -30,7 +30,7 @@ impl<T> Clone for Vec<T> where T: Clone { }
We could map these declarations to some Horn clauses, written in a We could map these declarations to some Horn clauses, written in a
Prolog-like notation, as follows: Prolog-like notation, as follows:
``` ```txt
Clone(usize). Clone(usize).
Clone(Vec<?T>) :- Clone(?T). Clone(Vec<?T>) :- Clone(?T).
@ -51,18 +51,18 @@ so by applying the rules recursively:
- `Clone(Vec<Vec<usize>>)` is provable if: - `Clone(Vec<Vec<usize>>)` is provable if:
- `Clone(Vec<usize>)` is provable if: - `Clone(Vec<usize>)` is provable if:
- `Clone(usize)` is provable. (Which it is, so we're all good.) - `Clone(usize)` is provable. (Which it is, so we're all good.)
But now suppose we tried to prove that `Clone(Vec<Bar>)`. This would But now suppose we tried to prove that `Clone(Vec<Bar>)`. This would
fail (after all, I didn't give an impl of `Clone` for `Bar`): fail (after all, I didn't give an impl of `Clone` for `Bar`):
- `Clone(Vec<Bar>)` is provable if: - `Clone(Vec<Bar>)` is provable if:
- `Clone(Bar)` is provable. (But it is not, as there are no applicable rules.) - `Clone(Bar)` is provable. (But it is not, as there are no applicable rules.)
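Under these two rules, provability is just a recursive check over the type's structure. A toy Rust sketch (illustrative only; the type representation is invented for this example):

```rust
// Toy sketch of the two Horn clauses
//   Clone(usize).
//   Clone(Vec<?T>) :- Clone(?T).
// written as a recursive check over a tiny type representation.
enum Type {
    Usize,
    Vec(Box<Type>),
    Bar, // a type with no Clone impl
}

fn clone_provable(ty: &Type) -> bool {
    match ty {
        Type::Usize => true,                     // Clone(usize).
        Type::Vec(elem) => clone_provable(elem), // Clone(Vec<?T>) :- Clone(?T).
        Type::Bar => false,                      // no applicable rule
    }
}

fn main() {
    let vec_vec_usize = Type::Vec(Box::new(Type::Vec(Box::new(Type::Usize))));
    assert!(clone_provable(&vec_vec_usize)); // Clone(Vec<Vec<usize>>) holds
    assert!(!clone_provable(&Type::Vec(Box::new(Type::Bar)))); // fails on Bar
}
```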
We can easily extend the example above to cover generic traits with We can easily extend the example above to cover generic traits with
more than one input type. So imagine the `Eq<T>` trait, which declares more than one input type. So imagine the `Eq<T>` trait, which declares
that `Self` is equatable with a value of type `T`: that `Self` is equatable with a value of type `T`:
```rust ```rust,ignore
trait Eq<T> { ... } trait Eq<T> { ... }
impl Eq<usize> for usize { } impl Eq<usize> for usize { }
impl<T: Eq<U>> Eq<Vec<U>> for Vec<T> { } impl<T: Eq<U>> Eq<Vec<U>> for Vec<T> { }
@ -70,12 +70,12 @@ impl<T: Eq<U>> Eq<Vec<U>> for Vec<T> { }
That could be mapped as follows: That could be mapped as follows:
``` ```txt
Eq(usize, usize). Eq(usize, usize).
Eq(Vec<?T>, Vec<?U>) :- Eq(?T, ?U). Eq(Vec<?T>, Vec<?U>) :- Eq(?T, ?U).
``` ```
So far so good. So far so good.
## Type-checking normal functions ## Type-checking normal functions
@ -90,7 +90,7 @@ that we need to prove, and those come from type-checking.
Consider type-checking the function `foo()` here: Consider type-checking the function `foo()` here:
```rust ```rust,ignore
fn foo() { bar::<usize>() } fn foo() { bar::<usize>() }
fn bar<U: Eq<U>>() { } fn bar<U: Eq<U>>() { }
``` ```
@ -105,7 +105,7 @@ If we wanted, we could write a Prolog predicate that defines the
conditions under which `bar()` can be called. We'll say that when those conditions under which `bar()` can be called. We'll say that when those
conditions hold, `bar()` is "well-formed": conditions hold, `bar()` is "well-formed":
``` ```txt
barWellFormed(?U) :- Eq(?U, ?U). barWellFormed(?U) :- Eq(?U, ?U).
``` ```
@ -113,7 +113,7 @@ Then we can say that `foo()` type-checks if the reference to
`bar::<usize>` (that is, `bar()` applied to the type `usize`) is `bar::<usize>` (that is, `bar()` applied to the type `usize`) is
well-formed: well-formed:
``` ```txt
fooTypeChecks :- barWellFormed(usize). fooTypeChecks :- barWellFormed(usize).
``` ```
@ -122,7 +122,7 @@ If we try to prove the goal `fooTypeChecks`, it will succeed:
- `fooTypeChecks` is provable if: - `fooTypeChecks` is provable if:
- `barWellFormed(usize)`, which is provable if: - `barWellFormed(usize)`, which is provable if:
- `Eq(usize, usize)`, which is provable because of an impl. - `Eq(usize, usize)`, which is provable because of an impl.
Ok, so far so good. Let's move on to type-checking a more complex function. Ok, so far so good. Let's move on to type-checking a more complex function.
## Type-checking generic functions: beyond Horn clauses ## Type-checking generic functions: beyond Horn clauses
@ -134,7 +134,7 @@ a generic function, it turns out we need a stronger notion of goal than Prolog
can provide. To see what I'm talking about, let's revamp our previous can provide. To see what I'm talking about, let's revamp our previous
example to make `foo` generic: example to make `foo` generic:
```rust ```rust,ignore
fn foo<T: Eq<T>>() { bar::<T>() } fn foo<T: Eq<T>>() { bar::<T>() }
fn bar<U: Eq<U>>() { } fn bar<U: Eq<U>>() { }
``` ```
@ -144,7 +144,7 @@ To type-check the body of `foo`, we need to be able to hold the type
type-safe *for all types `T`*, not just for some specific type. We might express type-safe *for all types `T`*, not just for some specific type. We might express
this like so: this like so:
``` ```txt
fooTypeChecks :- fooTypeChecks :-
// for all types T... // for all types T...
forall<T> { forall<T> {

View File

@ -10,7 +10,7 @@ The `tcx` ("typing context") is the central data structure in the
compiler. It is the context that you use to perform all manner of compiler. It is the context that you use to perform all manner of
queries. The struct `TyCtxt` defines a reference to this shared context: queries. The struct `TyCtxt` defines a reference to this shared context:
```rust ```rust,ignore
tcx: TyCtxt<'a, 'gcx, 'tcx> tcx: TyCtxt<'a, 'gcx, 'tcx>
// -- ---- ---- // -- ---- ----
// | | | // | | |
@ -47,7 +47,7 @@ for the `'gcx` and `'tcx` parameters of `TyCtxt`. Just to be a touch
confusing, we tend to use the name `'tcx` in such contexts. Here is an confusing, we tend to use the name `'tcx` in such contexts. Here is an
example: example:
```rust ```rust,ignore
fn not_in_inference<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>, def_id: DefId) { fn not_in_inference<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>, def_id: DefId) {
// ---- ---- // ---- ----
// Using the same lifetime here asserts // Using the same lifetime here asserts
@ -59,7 +59,7 @@ fn not_in_inference<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>, def_id: DefId) {
In contrast, if we want code that is usable during type inference, then In contrast, if we want code that is usable during type inference, then
you need to declare a distinct `'gcx` and `'tcx` lifetime parameter: you need to declare a distinct `'gcx` and `'tcx` lifetime parameter:
```rust ```rust,ignore
fn maybe_in_inference<'a, 'gcx, 'tcx>(tcx: TyCtxt<'a, 'gcx, 'tcx>, def_id: DefId) { fn maybe_in_inference<'a, 'gcx, 'tcx>(tcx: TyCtxt<'a, 'gcx, 'tcx>, def_id: DefId) {
// ---- ---- // ---- ----
// Using different lifetimes here means that // Using different lifetimes here means that
@ -74,7 +74,7 @@ Rust types are represented using the `Ty<'tcx>` defined in the `ty`
module (not to be confused with the `Ty` struct from [the HIR]). This module (not to be confused with the `Ty` struct from [the HIR]). This
is in fact a simple type alias for a reference with `'tcx` lifetime: is in fact a simple type alias for a reference with `'tcx` lifetime:
```rust ```rust,ignore
pub type Ty<'tcx> = &'tcx TyS<'tcx>; pub type Ty<'tcx> = &'tcx TyS<'tcx>;
``` ```
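The point of handing out `&'tcx TyS` references is *interning*: each distinct type is allocated exactly once, so equality checks can compare pointers rather than deep structures. A toy sketch of the idea, using `Rc` and a map instead of rustc's arenas (invented names; not the actual implementation):

```rust
use std::collections::HashMap;
use std::rc::Rc;

// Toy sketch of type interning: each distinct type gets one allocation,
// so equality can be a cheap pointer comparison.
#[derive(Hash, PartialEq, Eq, Clone)]
enum TyKind {
    Usize,
    Array(Rc<TyKind>, usize),
}

struct Interner {
    map: HashMap<TyKind, Rc<TyKind>>,
}

impl Interner {
    fn new() -> Self {
        Interner { map: HashMap::new() }
    }
    // Return the unique allocation for `ty`, creating it on first use.
    fn intern(&mut self, ty: TyKind) -> Rc<TyKind> {
        self.map.entry(ty.clone()).or_insert_with(|| Rc::new(ty)).clone()
    }
}

fn main() {
    let mut interner = Interner::new();
    let a = interner.intern(TyKind::Usize);
    let b = interner.intern(TyKind::Usize);
    // Same interned value: pointer equality suffices.
    assert!(Rc::ptr_eq(&a, &b));
}
```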
@ -89,7 +89,7 @@ the rustc arenas (never e.g. on the stack).
One common operation on types is to **match** and see what kinds of One common operation on types is to **match** and see what kinds of
types they are. This is done by doing `match ty.sty`, sort of like this: types they are. This is done by doing `match ty.sty`, sort of like this:
```rust ```rust,ignore
fn test_type<'tcx>(ty: Ty<'tcx>) { fn test_type<'tcx>(ty: Ty<'tcx>) {
match ty.sty { match ty.sty {
ty::TyArray(elem_ty, len) => { ... } ty::TyArray(elem_ty, len) => { ... }
@ -111,7 +111,7 @@ To allocate a new type, you can use the various `mk_` methods defined
on the `tcx`. These have names that correspond mostly to the various kinds on the `tcx`. These have names that correspond mostly to the various kinds
of type variants. For example: of type variants. For example:
```rust ```rust,ignore
let array_ty = tcx.mk_array(elem_ty, len * 2); let array_ty = tcx.mk_array(elem_ty, len * 2);
``` ```
@ -158,7 +158,7 @@ module. Here are a few examples:
Although there is no hard and fast rule, the `ty` module tends to be used like Although there is no hard and fast rule, the `ty` module tends to be used like
so: so:
```rust ```rust,ignore
use ty::{self, Ty, TyCtxt}; use ty::{self, Ty, TyCtxt};
``` ```

View File

@ -17,9 +17,9 @@ similar conversions for where-clauses and other bits of the function signature.
To try and get a sense for the difference, consider this function: To try and get a sense for the difference, consider this function:
```rust ```rust,ignore
struct Foo { } struct Foo { }
fn foo(x: Foo, y: self::Foo) { .. } fn foo(x: Foo, y: self::Foo) { ... }
// ^^^ ^^^^^^^^^ // ^^^ ^^^^^^^^^
``` ```

View File

@ -21,7 +21,7 @@ signature, such as the `'a` in `for<'a> fn(&'a u32)`. A region is
You create and "enter" an inference context by doing something like You create and "enter" an inference context by doing something like
the following: the following:
```rust ```rust,ignore
tcx.infer_ctxt().enter(|infcx| { tcx.infer_ctxt().enter(|infcx| {
// Use the inference context `infcx` here. // Use the inference context `infcx` here.
}) })
@ -88,7 +88,7 @@ The most basic operation you can perform in the type inferencer is
recommended way to add an equality constraint is to use the `at` recommended way to add an equality constraint is to use the `at`
method, roughly like so: method, roughly like so:
```rust ```rust,ignore
infcx.at(...).eq(t, u); infcx.at(...).eq(t, u);
``` ```
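A toy model of what an equality operation has to do (illustrative only; the names are invented, and the occurs check and many other details are omitted): resolve each side through the current substitution, then recursively match structure, recording bindings for unresolved variables:

```rust
use std::collections::HashMap;

// Toy sketch of equality unification over a tiny term language.
#[derive(Clone, PartialEq, Debug)]
enum Term {
    Var(u32), // an inference variable like ?T
    I32,
    Vec(Box<Term>),
}

// Follow variable bindings in the substitution to a final value.
fn resolve(t: &Term, subst: &HashMap<u32, Term>) -> Term {
    match t {
        Term::Var(v) => subst
            .get(v)
            .map(|u| resolve(u, subst))
            .unwrap_or_else(|| t.clone()),
        _ => t.clone(),
    }
}

fn unify(a: &Term, b: &Term, subst: &mut HashMap<u32, Term>) -> bool {
    let (a, b) = (resolve(a, subst), resolve(b, subst));
    match (a, b) {
        // An unbound variable unifies with anything, recording a binding.
        // (A real implementation also needs an occurs check here.)
        (Term::Var(v), t) | (t, Term::Var(v)) => {
            subst.insert(v, t);
            true
        }
        (Term::I32, Term::I32) => true,
        (Term::Vec(x), Term::Vec(y)) => unify(&x, &y, subst),
        _ => false,
    }
}

fn main() {
    let mut subst = HashMap::new();
    // Unify ?0 with Vec<i32>: succeeds, recording ?0 = Vec<i32>.
    assert!(unify(&Term::Var(0), &Term::Vec(Box::new(Term::I32)), &mut subst));
    assert_eq!(resolve(&Term::Var(0), &subst), Term::Vec(Box::new(Term::I32)));
    // Now unifying ?0 with i32 must fail.
    assert!(!unify(&Term::Var(0), &Term::I32, &mut subst));
}
```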
@ -159,7 +159,9 @@ is to first "generalize" `&'a i32` into a type with a region variable:
`&'?b i32`, and then unify `?T` with that (`?T = &'?b i32`). We then `&'?b i32`, and then unify `?T` with that (`?T = &'?b i32`). We then
relate this new variable with the original bound: relate this new variable with the original bound:
&'?b i32 <: &'a i32 ```txt
&'?b i32 <: &'a i32
```
This will result in a region constraint (see below) of `'?b: 'a`. This will result in a region constraint (see below) of `'?b: 'a`.
@ -176,12 +178,16 @@ eagerly unifying things, we simply collect constraints as we go, but
make (almost) no attempt to solve regions. These constraints have the make (almost) no attempt to solve regions. These constraints have the
form of an "outlives" constraint: form of an "outlives" constraint:
'a: 'b ```txt
'a: 'b
```
Actually the code tends to view them as a subregion relation, but it's the same Actually the code tends to view them as a subregion relation, but it's the same
idea: idea:
'b <= 'a ```txt
'b <= 'a
```
(There are various other kinds of constraints, such as "verifys"; see (There are various other kinds of constraints, such as "verifys"; see
the `region_constraints` module for details.) the `region_constraints` module for details.)
@ -189,7 +195,9 @@ the `region_constraints` module for details.)
There is one case where we do some amount of eager unification. If you have an There is one case where we do some amount of eager unification. If you have an
equality constraint between two regions equality constraint between two regions
'a = 'b ```txt
'a = 'b
```
we will record that fact in a unification table. You can then use we will record that fact in a unification table. You can then use
`opportunistic_resolve_var` to convert `'b` to `'a` (or vice `opportunistic_resolve_var` to convert `'b` to `'a` (or vice

View File

@ -54,7 +54,7 @@ constraints will be satisfied.
As a simple example, consider: As a simple example, consider:
```rust ```rust,ignore
enum Option<A> { Some(A), None } enum Option<A> { Some(A), None }
enum OptionalFn<B> { Some(|B|), None } enum OptionalFn<B> { Some(|B|), None }
enum OptionalMap<C> { Some(|C| -> C), None } enum OptionalMap<C> { Some(|C| -> C), None }
@ -62,19 +62,23 @@ enum OptionalMap<C> { Some(|C| -> C), None }
Here, we will generate the constraints: Here, we will generate the constraints:
1. V(A) <= + ```txt
2. V(B) <= - 1. V(A) <= +
3. V(C) <= + 2. V(B) <= -
4. V(C) <= - 3. V(C) <= +
4. V(C) <= -
```
These indicate that (1) the variance of A must be at most covariant; These indicate that (1) the variance of A must be at most covariant;
(2) the variance of B must be at most contravariant; and (3, 4) the (2) the variance of B must be at most contravariant; and (3, 4) the
variance of C must be at most covariant *and* contravariant. All of these variance of C must be at most covariant *and* contravariant. All of these
results are based on a variance lattice defined as follows: results are based on a variance lattice defined as follows:
* Top (bivariant) ```txt
- + * Top (bivariant)
o Bottom (invariant) - +
o Bottom (invariant)
```
Based on this lattice, the solution `V(A)=+`, `V(B)=-`, `V(C)=o` is the Based on this lattice, the solution `V(A)=+`, `V(B)=-`, `V(C)=o` is the
optimal solution. Note that there is always a naive solution which optimal solution. Note that there is always a naive solution which
@ -85,8 +89,10 @@ is that the variance of a use site may itself be a function of the
variance of other type parameters. In full generality, our constraints variance of other type parameters. In full generality, our constraints
take the form: take the form:
V(X) <= Term ```txt
Term := + | - | * | o | V(X) | Term x Term V(X) <= Term
Term := + | - | * | o | V(X) | Term x Term
```
Here the notation `V(X)` indicates the variance of a type/region Here the notation `V(X)` indicates the variance of a type/region
parameter `X` with respect to its defining class. `Term x Term` parameter `X` with respect to its defining class. `Term x Term`
@ -101,7 +107,7 @@ represents the "variance transform" as defined in the paper:
If I have a struct or enum with where clauses: If I have a struct or enum with where clauses:
```rust ```rust,ignore
struct Foo<T: Bar> { ... } struct Foo<T: Bar> { ... }
``` ```
@ -170,9 +176,11 @@ another.
To see what I mean, consider a trait like so: To see what I mean, consider a trait like so:
trait ConvertTo<A> { ```rust
fn convertTo(&self) -> A; trait ConvertTo<A> {
} fn convertTo(&self) -> A;
}
```
Intuitively, if we had one object `O=&ConvertTo<Object>` and another Intuitively, if we had one object `O=&ConvertTo<Object>` and another
`S=&ConvertTo<String>`, then `S <: O` because `String <: Object` `S=&ConvertTo<String>`, then `S <: O` because `String <: Object`
@ -200,20 +208,24 @@ But traits aren't only used with objects. They're also used when
deciding whether a given impl satisfies a given trait bound. To set the deciding whether a given impl satisfies a given trait bound. To set the
scene here, imagine I had a function: scene here, imagine I had a function:
fn convertAll<A,T:ConvertTo<A>>(v: &[T]) { ```rust,ignore
... fn convertAll<A,T:ConvertTo<A>>(v: &[T]) { ... }
} ```
Now imagine that I have an implementation of `ConvertTo` for `Object`: Now imagine that I have an implementation of `ConvertTo` for `Object`:
impl ConvertTo<i32> for Object { ... } ```rust,ignore
impl ConvertTo<i32> for Object { ... }
```
And I want to call `convertAll` on an array of strings. Suppose And I want to call `convertAll` on an array of strings. Suppose
further that for whatever reason I specifically supply the value of further that for whatever reason I specifically supply the value of
`String` for the type parameter `T`: `String` for the type parameter `T`:
let mut vector = vec!["string", ...]; ```rust,ignore
convertAll::<i32, String>(vector); let mut vector = vec!["string", ...];
convertAll::<i32, String>(vector);
```
Is this legal? To put another way, can we apply the `impl` for Is this legal? To put another way, can we apply the `impl` for
`Object` to the type `String`? The answer is yes, but to see why `Object` to the type `String`? The answer is yes, but to see why
@ -222,11 +234,9 @@ we have to expand out what will happen:
- `convertAll` will create a pointer to one of the entries in the - `convertAll` will create a pointer to one of the entries in the
vector, which will have type `&String` vector, which will have type `&String`
- It will then call the impl of `convertTo()` that is intended - It will then call the impl of `convertTo()` that is intended
for use with objects. This has the type: for use with objects. This has the type `fn(self: &Object) -> i32`.
fn(self: &Object) -> i32 It is OK to provide a value for `self` of type `&String` because
It is ok to provide a value for `self` of type `&String` because
`&String <: &Object`. `&String <: &Object`.
OK, so intuitively we want this to be legal, so let's bring this back OK, so intuitively we want this to be legal, so let's bring this back
@ -238,11 +248,15 @@ Maybe it's helpful to think of a dictionary-passing implementation of
type classes. In that case, `convertAll()` takes an implicit parameter type classes. In that case, `convertAll()` takes an implicit parameter
representing the impl. In short, we *have* an impl of type: representing the impl. In short, we *have* an impl of type:
V_O = ConvertTo<i32> for Object ```txt
V_O = ConvertTo<i32> for Object
```
and the function prototype expects an impl of type: and the function prototype expects an impl of type:
V_S = ConvertTo<i32> for String ```txt
V_S = ConvertTo<i32> for String
```
As with any argument, this is legal if the type of the value given As with any argument, this is legal if the type of the value given
(`V_O`) is a subtype of the type expected (`V_S`). So is `V_O <: V_S`? (`V_O`) is a subtype of the type expected (`V_S`). So is `V_O <: V_S`?
@ -250,9 +264,11 @@ The answer will depend on the variance of the various parameters. In
this case, because the `Self` parameter is contravariant and `A` is this case, because the `Self` parameter is contravariant and `A` is
covariant, it means that: covariant, it means that:
V_O <: V_S iff ```txt
i32 <: i32 V_O <: V_S iff
String <: Object i32 <: i32
String <: Object
```
These conditions are satisfied and so we are happy. These conditions are satisfied and so we are happy.
@ -263,7 +279,9 @@ expressions -- must be invariant with respect to all of their
inputs. To see why this makes sense, consider what subtyping for a inputs. To see why this makes sense, consider what subtyping for a
trait reference means: trait reference means:
<T as Trait> <: <U as Trait> ```txt
<T as Trait> <: <U as Trait>
```
means that if I know that `T as Trait`, I also know that `U as means that if I know that `T as Trait`, I also know that `U as
Trait`. Moreover, if you think of it as dictionary passing style, Trait`. Moreover, if you think of it as dictionary passing style,
@ -279,7 +297,7 @@ Another related reason is that if we didn't make traits with
associated types invariant, then projection is no longer a associated types invariant, then projection is no longer a
function with a single result. Consider: function with a single result. Consider:
``` ```rust,ignore
trait Identity { type Out; fn foo(&self); } trait Identity { type Out; fn foo(&self); }
impl<T> Identity for T { type Out = T; ... } impl<T> Identity for T { type Out = T; ... }
``` ```
@ -287,9 +305,11 @@ impl<T> Identity for T { type Out = T; ... }
Now if I have `<&'static () as Identity>::Out`, this can be Now if I have `<&'static () as Identity>::Out`, this can be
validly derived as `&'a ()` for any `'a`: validly derived as `&'a ()` for any `'a`:
<&'a () as Identity> <: <&'static () as Identity> ```txt
if &'static () <: &'a ()    -- Identity is contravariant in Self <&'a () as Identity> <: <&'static () as Identity>
if 'static : 'a             -- Subtyping rules for relations if &'static () <: &'a () -- Identity is contravariant in Self
if 'static : 'a -- Subtyping rules for relations
```
This change, on the other hand, means that `<&'static () as Identity>::Out` is This change, on the other hand, means that `<&'static () as Identity>::Out` is
always `&'static ()` (which might then be upcast to `&'a ()`, always `&'static ()` (which might then be upcast to `&'a ()`,