Replace txt with text

varkor 2018-04-06 19:35:21 +01:00 committed by Who? Me?!
parent 43341c8894
commit 8889dcd15d
17 changed files with 101 additions and 101 deletions

View File

@ -177,7 +177,7 @@ foo.rs` (assuming you have a Rust program called `foo.rs`. You can also pass any
command line arguments that you would normally pass to rustc). When you run it
you'll see output similar to
```txt
```text
In crate: foo,
Found 12 uses of `println!`;

View File

@ -19,7 +19,7 @@ compilation improves, that may change.)
The dependency structure of these crates is roughly a diamond:
```txt
```text
                  rustc_driver
                /      |       \
               /       |        \

View File

@ -48,7 +48,7 @@ the graph. You can filter in three ways:
To filter, use the `RUST_DEP_GRAPH_FILTER` environment variable, which should
look like one of the following:
```txt
```text
source_filter // nodes originating from source_filter
-> target_filter // nodes that can reach target_filter
source_filter -> target_filter // nodes in between source_filter and target_filter
@ -58,14 +58,14 @@ source_filter -> target_filter // nodes in between source_filter and target_filt
A node is considered to match a filter if all of those strings appear in its
label. So, for example:
```txt
```text
RUST_DEP_GRAPH_FILTER='-> TypeckTables'
```
would select the predecessors of all `TypeckTables` nodes. Usually though you
want the `TypeckTables` node for some particular fn, so you might write:
```txt
```text
RUST_DEP_GRAPH_FILTER='-> TypeckTables & bar'
```
@ -75,7 +75,7 @@ with `bar` in their name.
Perhaps you are finding that when you change `foo` you need to re-type-check
`bar`, but you don't think you should have to. In that case, you might do:
```txt
```text
RUST_DEP_GRAPH_FILTER='Hir & foo -> TypeckTables & bar'
```
@ -105,7 +105,7 @@ check of `bar` and you don't think there should be. You dump the
dep-graph as described in the previous section and open `dep-graph.txt`
to see something like:
```txt
```text
Hir(foo) -> Collect(bar)
Collect(bar) -> TypeckTables(bar)
```

View File

@ -99,7 +99,7 @@ So, let's continue our example. Imagine that we were calling a method
that defines it with `&self` for the type `Rc<U>` as well as a method
on the type `Box` that defines `Foo` but with `&mut self`. Then we
might have two candidates:
```txt
```text
&Rc<Box<[T; 3]>> from the impl of `Foo` for `Rc<U>` where `U=Box<[T; 3]>`
&mut Box<[T; 3]> from the inherent impl on `Box<U>` where `U=[T; 3]`
```
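
For readers skimming this hunk in isolation, here is a rough, self-contained Rust sketch of the situation being described. The method name and the `MyBox` stand-in (we cannot add inherent methods to the real `Box` outside of std) are assumptions for illustration only:
```rust
use std::rc::Rc;

trait Foo {
    fn frob(&self); // the `&self` candidate, from the impl for `Rc<U>`
}

impl<U> Foo for Rc<U> {
    fn frob(&self) {}
}

// Stand-in for `Box`, so we can write an inherent `&mut self` method locally.
struct MyBox<T>(T);

impl<T> MyBox<T> {
    fn frob(&mut self) {} // the `&mut self` inherent candidate
}

fn main() {
    let receiver: Rc<MyBox<[i32; 3]>> = Rc::new(MyBox([1, 2, 3]));
    receiver.frob(); // resolves to the `&self` method from `impl Foo for Rc<U>`
}
```
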

View File

@ -52,7 +52,7 @@ fn main() {
The files have names like `rustc.main.000-000.CleanEndRegions.after.mir`. These
names have a number of parts:
```txt
```text
rustc.main.000-000.CleanEndRegions.after.mir
      ---- --- --- --------------- ----- either before or after
      |    |   |   name of the pass
@ -159,7 +159,7 @@ ensuring that the reads have already happened (remember that
[queries are memoized](./query.html), so executing a query twice
simply loads from a cache the second time):
```txt
```text
mir_const(D) --read-by--> mir_const_qualif(D)
     |                              ^
  stolen-by                         |

View File

@ -122,7 +122,7 @@ stack, for example). But *how* do we reject it and *why*?
When we type-check `main`, and in particular the call `bar(foo)`, we
are going to wind up with a subtyping relationship like this one:
```txt
```text
fn(&'static u32) <: for<'a> fn(&'a u32)
----------------    -------------------
the type of `foo`   the type `bar` expects
@ -137,7 +137,7 @@ regions" -- they represent, basically, "some unknown region".
Once we've done that replacement, we have the following relation:
```txt
```text
fn(&'static u32) <: fn(&'!1 u32)
```
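
For orientation, a program that produces a subtyping obligation of exactly this shape might look as follows (the definitions are assumed here; the chapter's own example sits above this hunk). As the rest of the walkthrough explains, the program is rejected:
```rust
fn foo(x: &'static u32) {
    let _ = x;
}

fn bar(f: for<'a> fn(&'a u32)) {
    let local = 22;
    f(&local); // `bar` may call `f` with a reference of *any* lifetime
}

fn main() {
    bar(foo); //~ ERROR: `fn(&'static u32)` is not a subtype of `for<'a> fn(&'a u32)`
}
```
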
@ -151,7 +151,7 @@ subtypes, we check if their arguments have the desired relationship
(fn arguments are [contravariant](./appendix-background.html#variance), so
we swap the left and right here):
```txt
```text
&'!1 u32 <: &'static u32
```
@ -201,7 +201,7 @@ Here, the name `'b` is not part of the root universe. Instead, when we
"enter" into this `for<'b>` (e.g., by skolemizing it), we will create
a child universe of the root, let's call it U1:
```txt
```text
U0 (root universe)
└─ U1 (child universe)
@ -224,7 +224,7 @@ When we enter *this* type, we will again create a new universe, which
we'll call `U2`. Its parent will be the root universe, and U1 will be
its sibling:
```txt
```text
U0 (root universe)
├─ U1 (child universe)
@ -263,7 +263,7 @@ children, that inference variable X would have to be in U0. And since
X is in U0, it cannot name anything from U1 (or U2). This is perhaps easiest
to see by using a kind of generic "logic" example:
```txt
```text
exists<X> {
forall<Y> { ... /* Y is in U1 ... */ }
forall<Z> { ... /* Z is in U2 ... */ }
@ -296,7 +296,7 @@ does not say region elements **will** appear.
In the region inference engine, outlives constraints have the form:
```txt
```text
V1: V2 @ P
```
@ -346,7 +346,7 @@ for universal regions from the fn signature.)
Put another way, the "universal regions" check can be considered to be
checking constraints like:
```txt
```text
{skol(1)}: V1
```
@ -358,13 +358,13 @@ made to represent the `!1` region.
OK, so far so good. Now let's walk through what would happen with our
first example:
```txt
```text
fn(&'static u32) <: fn(&'!1 u32) @ P // this point P is not important here
```
The region inference engine will create a region element domain like this:
```txt
```text
{ CFG; end('static); skol(1) }
  ---  ------------  ------- from the universe `!1`
   |   'static is always in scope
@ -375,20 +375,20 @@ It will always create two universal variables, one representing
`'static` and one representing `'!1`. Let's call them Vs and V1. They
will have initial values like so:
```txt
```text
Vs = { CFG; end('static) } // it is in U0, so can't name anything else
V1 = { skol(1) }
```
From the subtyping constraint above, we would have an outlives constraint like
```txt
```text
'!1: 'static @ P
```
To process this, we would grow the value of V1 to include all of Vs:
```txt
```text
Vs = { CFG; end('static) }
V1 = { CFG; end('static), skol(1) }
```
@ -405,7 +405,7 @@ In this case, `V1` *did* grow too large -- it is not known to outlive
What about this subtyping relationship?
```txt
```text
for<'a> fn(&'a u32, &'a u32)
<:
for<'b, 'c> fn(&'b u32, &'c u32)
@ -413,7 +413,7 @@ for<'b, 'c> fn(&'b u32, &'c u32)
Here we would skolemize the supertype, as before, yielding:
```txt
```text
for<'a> fn(&'a u32, &'a u32)
<:
fn(&'!1 u32, &'!2 u32)
@ -423,7 +423,7 @@ then we instantiate the variable on the left-hand side with an
existential in universe U2, yielding the following (`?n` is a notation
for an existential variable):
```txt
```text
fn(&'?3 u32, &'?3 u32)
<:
fn(&'!1 u32, &'!2 u32)
@ -431,14 +431,14 @@ fn(&'!1 u32, &'!2 u32)
Then we break this down further:
```txt
```text
&'!1 u32 <: &'?3 u32
&'!2 u32 <: &'?3 u32
```
and even further, yield up our region constraints:
```txt
```text
'!1: '?3
'!2: '?3
```
@ -465,7 +465,7 @@ common lifetime of our arguments. -nmatsakis)
Let's look at one last example. We'll extend the previous one to have
a return type:
```txt
```text
for<'a> fn(&'a u32, &'a u32) -> &'a u32
<:
for<'b, 'c> fn(&'b u32, &'c u32) -> &'b u32
@ -478,7 +478,7 @@ first one. That is unsound. Let's see how it plays out.
First, we skolemize the supertype:
```txt
```text
for<'a> fn(&'a u32, &'a u32) -> &'a u32
<:
fn(&'!1 u32, &'!2 u32) -> &'!1 u32
@ -486,7 +486,7 @@ fn(&'!1 u32, &'!2 u32) -> &'!1 u32
Then we instantiate the subtype with existentials (in U2):
```txt
```text
fn(&'?3 u32, &'?3 u32) -> &'?3 u32
<:
fn(&'!1 u32, &'!2 u32) -> &'!1 u32
@ -494,7 +494,7 @@ fn(&'!1 u32, &'!2 u32) -> &'!1 u32
And now we create the subtyping relationships:
```txt
```text
&'!1 u32 <: &'?3 u32 // arg 1
&'!2 u32 <: &'?3 u32 // arg 2
&'?3 u32 <: &'!1 u32 // return type
@ -503,7 +503,7 @@ And now we create the subtyping relationships:
And finally the outlives relationships. Here, let V1, V2, and V3 be the
variables we assign to `!1`, `!2`, and `?3` respectively:
```txt
```text
V1: V3
V2: V3
V3: V1
@ -511,7 +511,7 @@ V3: V1
Those variables will have these initial values:
```txt
```text
V1 in U1 = {skol(1)}
V2 in U2 = {skol(2)}
V3 in U2 = {}
@ -520,14 +520,14 @@ V3 in U2 = {}
Now because of the `V3: V1` constraint, we have to add `skol(1)` into `V3` (and
indeed it is visible from `V3`), so we get:
```txt
```text
V3 in U2 = {skol(1)}
```
then we have this constraint `V2: V3`, so we wind up having to enlarge
`V2` to include `skol(1)` (which it can also see):
```txt
```text
V2 in U2 = {skol(1), skol(2)}
```

View File

@ -159,7 +159,7 @@ _3 = &mut _1;
Assignments in general have the form:
```txt
```text
<Place> = <Rvalue>
```
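
As a quick illustration (not part of the chapter's own example), here is a tiny program whose MIR contains assignments of this `<Place> = <Rvalue>` shape; the local numbering in the comments is only indicative, and `rustc --emit=mir` shows the real output:
```rust
fn main() {
    let mut x = 5;   // roughly: _1 = const 5_i32;
    let p = &mut x;  // roughly: _2 = &mut _1;   (a borrow rvalue)
    *p += 1;         // a read of (*_2) plus an assignment back through it
}
```
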
@ -169,7 +169,7 @@ value: in this case, the rvalue is a mutable borrow expression, which
looks like `&mut <Place>`. So we can kind of define a grammar for
rvalues like so:
```txt
```text
<Rvalue> = & (mut)? <Place>
| <Operand> + <Operand>
| <Operand> - <Operand>

View File

@ -298,7 +298,7 @@ default regex flavor provided by `regex` crate.
The corresponding reference file will use the normalized output to test both
32-bit and 64-bit platforms:
```txt
```text
...
|
= note: source type: fn() ($PTR bits)

View File

@ -51,7 +51,7 @@ In our logic, normalization is defined by a predicate
impls. For example, the `impl` of `IntoIterator` for `Option<T>` that
we saw above would be lowered to a program clause like so:
```txt
```text
forall<T> {
Normalize(<Option<T> as IntoIterator>::Item -> T)
}
@ -101,7 +101,7 @@ consider an associated type projection equal to another type?":
We now introduce the `ProjectionEq` predicate to bring those two cases
together. The `ProjectionEq` predicate looks like so:
```txt
```text
ProjectionEq(<T as IntoIterator>::Item = U)
```
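
In surface Rust, this predicate is what an associated-type binding in a where-clause boils down to; a minimal sketch (the function is made up for illustration):
```rust
// `T: IntoIterator<Item = u32>` is sugar for, roughly,
// Implemented(T: IntoIterator) plus ProjectionEq(<T as IntoIterator>::Item = u32).
fn sum_items<T>(items: T) -> u32
where
    T: IntoIterator<Item = u32>,
{
    items.into_iter().sum()
}

fn main() {
    assert_eq!(sum_items(vec![1, 2, 3]), 6);
}
```
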
@ -109,7 +109,7 @@ and we will see that it can be proven *either* via normalization or
skolemization. As part of lowering an associated type declaration from
some trait, we create two program clauses for `ProjectionEq`:
```txt
```text
forall<T, U> {
ProjectionEq(<T as IntoIterator>::Item = U) :-
Normalize(<T as IntoIterator>::Item -> U)
@ -130,7 +130,7 @@ with unification. As described in the
[type inference](./type-inference.html) section, unification is
basically a procedure with a signature like this:
```txt
```text
Unify(A, B) = Result<(Subgoals, RegionConstraints), NoSolution>
```
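
A rough Rust rendering of that signature, with placeholder stub types rather than rustc's actual API:
```rust
struct Ty;
struct Subgoal;
struct RegionConstraint;
struct NoSolution;

// On success, unification may hand back residual subgoals (e.g. nested
// projection obligations) plus region constraints to be checked later.
fn unify(_a: &Ty, _b: &Ty) -> Result<(Vec<Subgoal>, Vec<RegionConstraint>), NoSolution> {
    Ok((Vec::new(), Vec::new()))
}

fn main() {
    assert!(unify(&Ty, &Ty).is_ok());
}
```
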

View File

@ -19,13 +19,13 @@ In a traditional Prolog system, when you start a query, the solver
will run off and start supplying you with every possible answer it can
find. So given something like this:
```txt
```text
?- Vec<i32>: AsRef<?U>
```
The solver might answer:
```txt
```text
Vec<i32>: AsRef<[i32]>
continue? (y/n)
```
@ -39,7 +39,7 @@ response with our original query -- Rust's solver gives back a
substitution instead). If we were to hit `y`, the solver might then
give us another possible answer:
```txt
```text
Vec<i32>: AsRef<Vec<i32>>
continue? (y/n)
```
@ -48,14 +48,14 @@ This answer derives from the fact that there is a reflexive impl
(`impl<T> AsRef<T> for T`) for `AsRef`. If we were to hit `y` again,
then we might get back a negative response:
```txt
```text
no
```
Naturally, in some cases, there may be no possible answers, and hence
the solver will just give me back `no` right away:
```txt
```text
?- Box<i32>: Copy
no
```
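
The same "no" answer can be observed from ordinary Rust code; a minimal check:
```rust
fn assert_copy<T: Copy>() {}

fn main() {
    assert_copy::<i32>();         // fine: i32 is Copy
    // assert_copy::<Box<i32>>(); // ERROR: the trait bound `Box<i32>: Copy` is not satisfied
}
```
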
@ -64,7 +64,7 @@ In some cases, there might be an infinite number of responses. So for
example if I gave this query, and I kept hitting `y`, then the solver
would never stop giving me back answers:
```txt
```text
?- Vec<?U>: Clone
Vec<i32>: Clone
continue? (y/n)
@ -82,13 +82,13 @@ layer of `Box` until we ask it to stop, or it runs out of memory.
Another interesting thing is that queries might still have variables
in them. For example:
```txt
```text
?- Rc<?T>: Clone
```
might produce the answer:
```txt
```text
Rc<?T>: Clone
continue? (y/n)
```
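
The reason the solver can answer positively while `?T` is still unknown is std's blanket `impl<T: ?Sized> Clone for Rc<T>`: cloning an `Rc` only bumps a reference count. A small demonstration:
```rust
use std::rc::Rc;

struct NotClone; // deliberately does not implement Clone

fn main() {
    let a = Rc::new(NotClone);
    let b = Rc::clone(&a); // works even though `NotClone: Clone` does not hold
    assert_eq!(Rc::strong_count(&b), 2);
}
```
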
@ -226,13 +226,13 @@ Let's suppose that the type checker decides to revisit the
Borrow<?U>`. `?U` is no longer an unbound inference variable; it now
has a value, `Vec<?V>`. So, if we "refresh" the query with that value, we get:
```txt
```text
Vec<?T>: Borrow<Vec<?V>>
```
This time, there is only one impl that applies, the reflexive impl:
```txt
```text
impl<T> Borrow<T> for T where T: ?Sized
```
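
A quick way to see the reflexive impl in action from user code (the helper function is invented for illustration):
```rust
use std::borrow::Borrow;

fn len_via_borrow<T: Borrow<Vec<i32>>>(x: T) -> usize {
    x.borrow().len()
}

fn main() {
    let v = vec![1, 2, 3];
    assert_eq!(len_via_borrow(&v), 3); // &Vec<i32>: Borrow<Vec<i32>>
    assert_eq!(len_via_borrow(v), 3);  // Vec<i32>: Borrow<Vec<i32>>, the reflexive impl
}
```
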

View File

@ -44,13 +44,13 @@ and treats them equally, so when canonicalizing, we will *also*
replace any [free lifetime](./appendix-background.html#free-vs-bound) with a
canonical variable. Therefore, we get the following result:
```txt
```text
?0: Foo<'?1, ?2>
```
Sometimes we write this differently, like so:
```txt
```text
for<T,L,T> { ?0: Foo<'?1, ?2> }
```
@ -61,7 +61,7 @@ so `?0` and `?2` are types; the `L` indicates a lifetime variable, so
`CanonicalVarValues` array OV with the "original values" for each
canonicalized variable:
```txt
```text
[?A, 'static, ?B]
```
@ -76,13 +76,13 @@ we create a substitution S from the canonical form containing a fresh
inference variable (of suitable kind) for each canonical variable.
So, for our example query:
```txt
```text
for<T,L,T> { ?0: Foo<'?1, ?2> }
```
the substitution S might be:
```txt
```text
S = [?A, '?B, ?C]
```
@ -90,7 +90,7 @@ We can then replace the bound canonical variables (`?0`, etc) with
these inference variables, yielding the following fully instantiated
query:
```txt
```text
?A: Foo<'?B, ?C>
```
@ -135,20 +135,20 @@ result substitution `var_values`, and some region constraints. To
create this, we wind up re-using the substitution S that we created
when first instantiating our query. To refresh your memory, we had a query
```txt
```text
for<T,L,T> { ?0: Foo<'?1, ?2> }
```
for which we made a substitution S:
```txt
```text
S = [?A, '?B, ?C]
```
We then did some work which unified some of those variables with other things.
If we "refresh" S with the latest results, we get:
```txt
```text
S = [Vec<?E>, '?D, ?E]
```
@ -157,7 +157,7 @@ our original query. Note though that they include some new variables
(like `?E`). We can make those go away by canonicalizing again! We don't
just canonicalize S, though, we canonicalize the whole query response QR:
```txt
```text
QR = {
certainty: Proven, // or whatever
var_values: [Vec<?E>, '?D, ?E] // this is S
@ -170,7 +170,7 @@ QR = {
The result would be as follows:
```txt
```text
Canonical(QR) = for<T, L> {
certainty: Proven,
var_values: [Vec<?0>, '?1, ?2]
@ -194,19 +194,19 @@ In the previous section we produced a canonical query result. We now have
to apply that result in our original context. If you recall, way back in the
beginning, we were trying to prove this query:
```txt
```text
?A: Foo<'static, ?B>
```
We canonicalized that into this:
```txt
```text
for<T,L,T> { ?0: Foo<'?1, ?2> }
```
and now we got back a canonical response:
```txt
```text
for<T, L> {
certainty: Proven,
var_values: [Vec<?0>, '?1, ?2]
@ -221,7 +221,7 @@ the result with a fresh inference variable, (b) unify the values in
the result with the original values, and then (c) record the region
constraints for later. Doing step (a) would yield a result of
```txt
```text
{
certainty: Proven,
var_values: [Vec<?C>, '?D, ?C]
@ -233,7 +233,7 @@ constraints for later. Doing step (a) would yield a result of
Step (b) would then unify:
```txt
```text
?A with Vec<?C>
'static with '?D
?B with ?C

View File

@ -12,7 +12,7 @@ a few new superpowers.
In Rust's solver, **goals** and **clauses** have the following forms
(note that the two definitions reference one another):
```txt
```text
Goal = DomainGoal // defined in the section below
| Goal && Goal
| Goal || Goal
@ -49,7 +49,7 @@ To define the set of *domain goals* in our system, we need to first
introduce a few simple formulations. A **trait reference** consists of
the name of a trait along with a suitable set of inputs P0..Pn:
```txt
```text
TraitRef = P0: TraitName<P1..Pn>
```
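
Some concrete instances of that shape, using std traits purely for illustration:
```rust
fn needs_trait_refs<T>()
where
    T: Clone,             // trait reference with no extra inputs: P0 = T
    T: PartialEq<String>, // one extra input: P0 = T, P1 = String
{
}

fn main() {
    needs_trait_refs::<String>(); // String: Clone and String: PartialEq<String> both hold
}
```
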
@ -63,13 +63,13 @@ T>`), that are not part of a trait reference.
A **projection** consists of an associated item reference along with
its inputs P0..Pm:
```txt
```text
Projection = <P0 as TraitName<P1..Pn>>::AssocItem<Pn+1..Pm>
```
Given that, we can define a `DomainGoal` as follows:
```txt
```text
DomainGoal = Implemented(TraitRef)
| ProjectionEq(Projection = Type)
| Normalize(Projection -> Type)
@ -112,7 +112,7 @@ DomainGoal = Implemented(TraitRef)
Most goals in our system are "inductive". In an inductive goal,
circular reasoning is disallowed. Consider this example clause:
```txt
```text
Implemented(Foo: Bar) :-
Implemented(Foo: Bar).
```
@ -140,7 +140,7 @@ struct Foo {
The default rules for auto traits say that `Foo` is `Send` if the
types of its fields are `Send`. Therefore, we would have a rule like
```txt
```text
Implemented(Foo: Send) :-
Implemented(Option<Box<Foo>>: Send).
```

View File

@ -49,7 +49,7 @@ standard [ui test] mechanisms to check them. In this case, there is a
need only be a prefix of the error), but [the stderr file] contains
the full details:
```txt
```text
error: Implemented(T: Foo) :- ProjectionEq(<T as std::iter::Iterator>::Item == i32), TypeOutlives(T \
: 'static), Implemented(T: std::iter::Iterator), Implemented(T: std::marker::Sized).
--> $DIR/lower_impl.rs:15:1

View File

@ -87,7 +87,7 @@ relationships between different kinds of domain goals. The first such
rule from the trait header creates the mapping between the `FromEnv`
and `Implemented` predicates:
```txt
```text
// Rule Implemented-From-Env
forall<Self, P1..Pn> {
Implemented(Self: Trait<P1..Pn>) :- FromEnv(Self: Trait<P1..Pn>)
@ -103,7 +103,7 @@ The next few clauses have to do with implied bounds (see also
[RFC 2089]: https://rust-lang.github.io/rfcs/2089-implied-bounds.html
```txt
```text
// Rule Implied-Bound-From-Trait
//
// For each where clause WC:
@ -130,7 +130,7 @@ clauses** but also things that follow from them.
The next rule is related; it defines what it means for a trait reference
to be **well-formed**:
```txt
```text
// Rule WellFormed-TraitRef
//
// For each where clause WC:
@ -197,7 +197,7 @@ the rules by which `ProjectionEq` can succeed; these two clauses are discussed
in detail in the [section on associated types](./traits-associated-types.html),
but reproduced here for reference:
```txt
```text
// Rule ProjectionEq-Normalize
//
// ProjectionEq can succeed by normalizing:
@ -227,7 +227,7 @@ the `Bounds` declared on the associated type must be proven to hold to
show that the impl is well-formed, and hence we can rely on them
elsewhere.
```txt
```text
// XXX how exactly should we set this up? Have to be careful;
// presumably this has to be a kind of `FromEnv` setup.
```
@ -253,7 +253,7 @@ where WC
Let `TraitRef` be the trait reference `A0: Trait<A1..An>`. Then we
will create the following rules:
```txt
```text
// Rule Implemented-From-Impl
forall<P0..Pn> {
Implemented(TraitRef) :- WC
@ -278,7 +278,7 @@ where WC
We produce the following rule:
```txt
```text
// Rule Normalize-From-Impl
forall<P0..Pm> {
forall<Pn+1..Pm> {

View File

@ -30,7 +30,7 @@ impl<T> Clone for Vec<T> where T: Clone { }
We could map these declarations to some Horn clauses, written in a
Prolog-like notation, as follows:
```txt
```text
Clone(usize).
Clone(Vec<?T>) :- Clone(?T).
@ -70,7 +70,7 @@ impl<T: Eq<U>> Eq<Vec<U>> for Vec<T> { }
That could be mapped as follows:
```txt
```text
Eq(usize, usize).
Eq(Vec<?T>, Vec<?U>) :- Eq(?T, ?U).
```
@ -105,7 +105,7 @@ If we wanted, we could write a Prolog predicate that defines the
conditions under which `bar()` can be called. We'll say that those
conditions are called being "well-formed":
```txt
```text
barWellFormed(?U) :- Eq(?U, ?U).
```
@ -113,7 +113,7 @@ Then we can say that `foo()` type-checks if the reference to
`bar::<usize>` (that is, `bar()` applied to the type `usize`) is
well-formed:
```txt
```text
fooTypeChecks :- barWellFormed(usize).
```
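
A self-contained Rust reconstruction of this running example; the trait is renamed `MyEq` here to avoid clashing with std's `Eq`, and the impls mirror the Horn clauses quoted above:
```rust
trait MyEq<U> {}

impl MyEq<usize> for usize {}                  // Eq(usize, usize).
impl<T: MyEq<U>, U> MyEq<Vec<U>> for Vec<T> {} // Eq(Vec<?T>, Vec<?U>) :- Eq(?T, ?U).

fn bar<U: MyEq<U>>() {} // callable only where "barWellFormed(?U)" holds

fn foo() {
    bar::<usize>(); // "fooTypeChecks :- barWellFormed(usize)." -- and it does hold
}

fn main() {
    foo();
}
```
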
@ -144,7 +144,7 @@ To type-check the body of `foo`, we need to be able to hold the type
type-safe *for all types `T`*, not just for some specific type. We might express
this like so:
```txt
```text
fooTypeChecks :-
// for all types T...
forall<T> {

View File

@ -159,7 +159,7 @@ is to first "generalize" `&'a i32` into a type with a region variable:
`&'?b i32`, and then unify `?T` with that (`?T = &'?b i32`). We then
relate this new variable with the original bound:
```txt
```text
&'?b i32 <: &'a i32
```
@ -178,14 +178,14 @@ eagerly unifying things, we simply collect constraints as we go, but
make (almost) no attempt to solve regions. These constraints have the
form of an "outlives" constraint:
```txt
```text
'a: 'b
```
Actually the code tends to view them as a subregion relation, but it's the same
idea:
```txt
```text
'b <= 'a
```
@ -195,7 +195,7 @@ the `region_constraints` module for details.)
There is one case where we do some amount of eager unification. If you have an
equality constraint between two regions
```txt
```text
'a = 'b
```

View File

@ -62,7 +62,7 @@ enum OptionalMap<C> { Some(|C| -> C), None }
Here, we will generate the constraints:
```txt
```text
1. V(A) <= +
2. V(B) <= -
3. V(C) <= +
@ -74,11 +74,11 @@ These indicate that (1) the variance of A must be at most covariant;
variance of C must be at most covariant *and* contravariant. All of these
results are based on a variance lattice defined as follows:
```txt
```text
   *      Top (bivariant)
-     +
   o      Bottom (invariant)
```txt
```text
Based on this lattice, the solution `V(A)=+`, `V(B)=-`, `V(C)=o` is the
optimal solution. Note that there is always a naive solution which
@ -89,7 +89,7 @@ is that the variance of a use site may itself be a function of the
variance of other type parameters. In full generality, our constraints
take the form:
```txt
```text
V(X) <= Term
Term := + | - | * | o | V(X) | Term x Term
```
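
The `OptionalMap` example quoted in the earlier hunk uses pre-1.0 closure syntax (`|C| -> C`); in today's Rust it would be written with a `fn` pointer. `C` then appears in both argument (contravariant) and return (covariant) position, which is why the best solution the lattice allows for it is invariance:
```rust
enum OptionalMap<C> {
    Some(fn(C) -> C),
    None,
}

fn main() {
    let double: OptionalMap<i32> = OptionalMap::Some(|x| x * 2);
    if let OptionalMap::Some(f) = double {
        assert_eq!(f(21), 42);
    }
}
```
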
@ -248,13 +248,13 @@ Maybe it's helpful to think of a dictionary-passing implementation of
type classes. In that case, `convertAll()` takes an implicit parameter
representing the impl. In short, we *have* an impl of type:
```txt
```text
V_O = ConvertTo<i32> for Object
```
and the function prototype expects an impl of type:
```txt
```text
V_S = ConvertTo<i32> for String
```
@ -264,7 +264,7 @@ The answer will depend on the variance of the various parameters. In
this case, because the `Self` parameter is contravariant and `A` is
covariant, it means that:
```txt
```text
V_O <: V_S iff
i32 <: i32
String <: Object
@ -279,7 +279,7 @@ expressions -- must be invariant with respect to all of their
inputs. To see why this makes sense, consider what subtyping for a
trait reference means:
```txt
```text
<T as Trait> <: <U as Trait>
```
@ -305,7 +305,7 @@ impl<T> Identity for T { type Out = T; ... }
Now if I have `<&'static () as Identity>::Out`, this can be
validly derived as `&'a ()` for any `'a`:
```txt
```text
<&'a () as Identity> <: <&'static () as Identity>
if &'static () <: &'a () -- Identity is contravariant in Self
if 'static : 'a -- Subtyping rules for relations