Commit 80e69565 by Giuseppe Castagna

### Various rewording and typos in Section 2

parent bef67da2
@@ -30,7 +30,7 @@
  set). It also ensures that the binary relation $\vartriangleright \,\subseteq\!\types^{2}$
  defined by $t_1 \lor t_2 \vartriangleright t_i$, $t_1 \land t_2 \vartriangleright
- t_i$, $\neg t \vartriangleright t$ is Noetherian.
+ t_i$, $\neg t \vartriangleright t$ is Noetherian.\footnote{In a nutshell, we can do proofs of induction on the structure of unions and negations---and, thus, intersections---but arrows, products, and basic types are the base cases for the induction.}
  This gives an induction principle on $\types$ that we will use without any
  further explicit reference to the relation. We refer to $b$, $\times$, and
  $\to$ as \emph{type constructors}
@@ -46,7 +46,7 @@
  Section~\ref{sec:syntax} right below) that have that type, and that subtyping
  is containment (\emph{ie}, a type $s$ is a subtype of a type $t$ if and only if
  $t$ contains all the values of type $s$). In particular, $s\to t$ contains all
  $\lambda$-abstractions that when applied to a value
- of type $s$, if the computation terminates, then they return a result of type
+ of type $s$, if their computation terminates, then they return a result of type
  $t$ (\emph{eg}, $\Empty\to\Any$ is the set of all
  functions\footnote{\label{allfunctions}Actually, for every type $t$, all types
  of the form $\Empty{\to}t$ are equivalent and each of them
@@ -80,8 +80,8 @@
  intuitively, $\lambda^{\wedge_{i\in I}s_i\to t_i} x.e$ is a well-typed value if
  for all $i{\in} I$ the hypothesis that $x$ is of type $s_i$ implies that the
  body $e$ has type $t_i$, that is to say, it is well typed if
  $\lambda^{\wedge_{i\in I}s_i\to t_i} x.e$ has type $s_i\to t_i$ for all $i\in I$.
- Every value is associated to a type: we described above the types of constants and abstractions and, inductively,
+ Every value is associated to a type: the type of $c$ is $\basic c$; the type of $\lambda^{\wedge_{i\in I}s_i\to t_i} x.e$ is $\wedge_{i\in I}s_i\to t_i$; and, inductively,
  the type of a pair of values is the product of the types of the values.
@@ -100,7 +100,7 @@
  The dynamic semantics is defined as a classic left-to-right cbv reduction for a
  \]
  The semantics of type-cases uses the relation $v\in t$ that we informally
  defined in the previous section. We delay its formal
- definition to Section~\ref{sec:type-schemes}. Context reductions are
+ definition to Section~\ref{sec:type-schemes} (it deals with some corner cases for functional values). Context reductions are
  defined by the following reduction contexts:
  \[\Cx[] ::= [\,]\alt \Cx e\alt v\Cx \alt (\Cx,e)\alt (v,\Cx)\alt \pi_i\Cx\alt \tcase{\Cx}tee\]
@@ -111,7 +111,7 @@
  \Cx[e]\reduces\Cx[e'].
  \subsection{Static semantics}
- While the syntax and reduction semantics are on the whole pretty
+ While the syntax and reduction semantics are, on the whole, pretty
  standard, for the type system we will have to introduce several
  unconventional features that we anticipated in
  Section~\ref{sec:challenges} and are at the core of our work. Let
@@ -175,12 +175,12 @@
  types. This explains why the classic typing rule for variables is replaced by a
  { \Gamma \vdash e: \Gamma(e) }
  { e\in\dom\Gamma }
  \qquad
- \Infer[Intersect]
+ \Infer[Inter]
  { \Gamma \vdash e:t_1\\\Gamma \vdash e:t_2 }
  { \Gamma \vdash e: t_1 \wedge t_2 }
  { }
  \end{mathpar}
- The \Rule{Env} rule is coupled with the classic intersection introduction rule
+ The \Rule{Env} rule is coupled with the standard intersection introduction rule
  \Rule{Inter} which allow us to deduce for a complex expression the
  intersection of the types recorded by the occurrence typing analysis
  in the environment $\Gamma$ with the static type deduced for the same
@@ -203,7 +203,7 @@
  $(\Int\to\Int)\setminus(\Int\to\neg\Int)$).
  But the rule \Rule{Abs+} above does not allow us to deduce negations of
  arrows for abstractions: the rule \Rule{Abs-} makes this possible. As an
  aside, note that this kind of deduction was already present in the system
  by~\citet{Frisch2008}
- though it that system this deduction was motivated by the semantics of types rather than
+ though it that system this deduction was motivated by the semantics of types rather than, as in our case, by the soundness of the type system.
  Rules \Rule{Abs+} and \Rule{Abs-} are not enough to deduce for
@@ -224,7 +224,7 @@
  typing) the expression $x+1$ is of type \Int{} (which holds) and that under the
  hypothesis that $x$ has type $\Int\setminus\Int$, that is $\Empty$ (we apply
  once more occurrence typing), \textsf{true} is of type \Int{} (which
  \emph{does not} hold). The problem is that we are trying to type the
- second case of a type-case even if we know that there is no chance that, when $x$ is bound to an \Int,
+ second case of a type-case even if we know that there is no chance that, when $x$ is bound to an integer,
  that case will be ever selected. The fact that it is never selected is
  witnessed by the presence of a type hypothesis with $\Empty$ type. To avoid
  this problem (and type the term above) we add the rule
@@ -257,15 +257,18 @@
  type-case:\beppe{Changed to \Rule{If} to \Rule{Case}}
  {\Gamma\vdash \tcase {e} t {e_1}{e_2}: t'}
  { }
  \end{mathpar}
- The rule \Rule{Case} checks whether the expression $e$, whose type is being tested, is well-typed and then performs the occurrence typing analysis that produces the environments $\Gamma_i$'s under whose hypothesis the expressions $e_i$'s are typed. The production of these environments is represented by the judgment $\Gamma \evdash e t \Gamma_i$. The intuition is that when $\Gamma \evdash e t \Gamma_i$ is provable then $\Gamma_i$ is a version of $\Gamma$ extended with type hypotheses for all expressions occurring in $e$, type hypotheses that can be deduced assuming that the test $e\in t$ succeeds.
+ The rule \Rule{Case} checks whether the expression $e$, whose type is being tested, is well-typed and then performs the occurrence typing analysis that produces the environments $\Gamma_i$'s under whose hypothesis the expressions $e_i$'s are typed. The production of these environments is represented by the judgments $\Gamma \evdash e {(\neg)t} \Gamma_i$. The intuition is that when $\Gamma \evdash e t \Gamma_1$ is provable then $\Gamma_1$ is a version of $\Gamma$ extended with type hypotheses for all expressions occurring in $e$, type hypotheses that can be deduced assuming that the test $e\in t$ succeeds. Likewise, $\Gamma \evdash e {\neg t} \Gamma_2$ (notice the negation on $t$) extends $\Gamma$ with the hypothesis deduced assuming that $e\in\neg t$, that is, for when the test $e\in t$ fails.
  All it remains to do is to show how to deduce judgments of the form
  $\Gamma \evdash e t \Gamma'$. For that we first have to define how
@@ -279,6 +282,7 @@
  Let $e$ be an expression and $\varpi\in\{0,1,l,r,f,s\}^*$ a
  \emph{path}; we denote $\occ e\varpi$ the occurrence of $e$ reached by
  the path $\varpi$, that is
  \[
+ \begin{array}{l}
  \begin{array}{r@{\downarrow}l@{\quad=\quad}l}
  e&\epsilon & e\\
  e_0e_1& i.\varpi & \occ{e_i}\varpi\qquad i=0,1\\
@@ -286,13 +290,15 @@
  e_0e_1& i.\varpi & \occ{e_i}\varpi\qquad i=0,1\\
  (e_0,e_1)& r.\varpi & \occ{e_1}\varpi\\
  \pi_1 e& f.\varpi & \occ{e}\varpi\\
  \pi_2 e& s.\varpi & \occ{e}\varpi\\
- \end{array}
- \]
- undefined otherwise
+ \end{array}\\
+ \text{undefined otherwise}
+ \end{array}
+ \]
  To ease our analysis we used different directions for each kind of
  term. So we have $0$ and $1$ for the function and argument of an
- application, $l$ and $r$ for the $l$eft and $r$ight expressions forming a pair
+ application, $l$ and $r$ for the $l$eft and $r$ight expressions forming a pair,
  and $f$ and $s$ for the argument of a $f$irst or of a
  $s$econd projection. The judgments $\Gamma \evdash e t \Gamma'$ are
  then deduced by the following two rules:
  \begin{mathpar}
... ...
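The definition of $\occ e\varpi$ in the hunk above can be made concrete with a small executable sketch. This is our own illustration, not part of the commit: the tagged-tuple AST encoding, the tag names, and the function name `occ` are all assumptions of the sketch; only the direction letters ($0$/$1$ for applications, $l$/$r$ for pairs, $f$/$s$ for projections) come from the paper.

```python
# Our toy AST encoding (not from the paper): ("app", e0, e1),
# ("pair", e0, e1), ("fst", e), ("snd", e); anything else is an atom.
def occ(e, path):
    """e↓ϖ: the occurrence of e reached by path ϖ, or None when undefined."""
    if not path:                                  # e↓ε = e
        return e
    d, rest = path[0], path[1:]
    if isinstance(e, tuple):
        tag = e[0]
        if tag == "app" and d in ("0", "1"):      # (e0 e1)↓i.ϖ = e_i↓ϖ
            return occ(e[1 + int(d)], rest)
        if tag == "pair" and d in ("l", "r"):     # (e0,e1)↓l.ϖ and ↓r.ϖ
            return occ(e[1] if d == "l" else e[2], rest)
        if tag == "fst" and d == "f":             # (π1 e)↓f.ϖ = e↓ϖ
            return occ(e[1], rest)
        if tag == "snd" and d == "s":             # (π2 e)↓s.ϖ = e↓ϖ
            return occ(e[1], rest)
    return None                                   # undefined otherwise

e = ("fst", ("pair", "x", ("app", "y", "z")))     # π1 (x, (y z))
print(occ(e, ["f", "l"]))        # → x
print(occ(e, ["f", "r", "0"]))   # → y
print(occ(e, ["l"]))             # → None: l is not a valid direction here
```

Returning `None` plays the role of the "undefined otherwise" clause of the definition.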
@@ -321,14 +327,14 @@
  deduce from $\Gamma$ all the hypothesis already in $\Gamma$ (rule
  \Rule{Base}) and that $(ii)$ if we can deduce a given type $t'$ for a
  particular occurrence $\varpi$ of the expression $e$ being checked, than we can
  add this hypothesis to the produced type environment (rule \Rule{Path}). The rule
- \Rule{Path} uses a (last) auxiliary judgement $\pvdash {\Gamma} p e t
+ \Rule{Path} uses a (last) auxiliary judgement $\pvdash {\Gamma} e t
  \varpi:t'$ to deduce the type $t'$ of the occurrence $\occ e \varpi$ when
  checking $e$ against $t$ under the hypotheses $\Gamma$. This rule
  \Rule{Path} is subtler than it may appear at first sight, insofar as the
  deduction of the type for $\varpi$ may already use some hypothesis on $\occ e
  \varpi$ (in $\Gamma'$) and, from an algorithmic viewpoint, this will imply the
  computation of a fix-point (see Section~\ref{sec:typenv}). The last ingredient
  for our type system is the deduction of the
- judgements of the form $\pvdash {\Gamma} p e t \varpi:t'$ where
+ judgements of the form $\pvdash {\Gamma} e t \varpi:t'$ where
  $\varpi$ is a path to an expression occurring in $e$. This is given by the
  following set of rules.
@@ -338,7 +344,7 @@
  { \pvdash \Gamma e t \varpi:t_2 }
  { }
  \qquad
- \Infer[PIntersect]
+ \Infer[PInter]
  { \pvdash \Gamma e t \varpi:t_1 \\ \pvdash \Gamma e t \varpi:t_2 }
  { \pvdash \Gamma e t \varpi:t_1\land t_2 }
  { }
@@ -388,13 +394,12 @@
  These rules implement the analysis described in
  Section~\ref{sec:ideas} for functions and extend it to products. Let
  us comment each rule in detail. \Rule{PSubs} is just subsumption for
- the deduction $\vdashp$. The rule \Rule{PIntersect} combined with
+ the deduction $\vdashp$. The rule \Rule{PInter} combined with
  \Rule{PTypeof} allows the system to deduce for an occurrence $\varpi$
  the intersection of the static type of $\occ e \varpi$ (deduced by
- \Rule{PTypeof}) and the type deduced for $\varpi$ by the other $\vdashp$ rules.
- The rule \Rule{PEps} is the starting point of the analysis: when
- checking whether $e$ has type $t$ we can assume that $e$ (\emph{ie},
- $\occ e\epsilon$) has type $t$.
+ \Rule{PTypeof}) with the type deduced for $\varpi$ by the other $\vdashp$ rules.
+ The rule \Rule{PEps} is the starting point of the analysis: if we are
+ assuming that the test $e\in t$ succeeds, then we can assume that $e$
+ (\emph{ie}, $\occ e\epsilon$) has type $t$ (recall that assuming that the test
+ $e\in t$ fails corresponds to having $\neg t$ at the index of the turnstyle).
  The rule \Rule{PApprR} implements occurrence typing for the arguments
  of applications, since it states that if a function maps arguments of
  type $t_1$ in results of type $t_2$ and an application of
@@ -404,7 +409,7 @@
  occurrence typing analysis for the function part of an application,
  since it states that if an application has type $t_2$ and the argument
  of this application has type $t_1$, then the function in this
  application cannot have type $t_1\to\neg t_2$. Rules \Rule{PPair\_}
- are straightforward since they state that $i$-th projection of a pair
+ are straightforward since they state that the $i$-th projection of a pair
  of type $\pair{t_1}{t_2}$ must be of type $t_i$. So are the last two
  rules that essentially state that if $\pi_1 e$ (respectively, $\pi_2
  e$) is of type $t'$, then the type of $e$ must be of the form
@@ -432,8 +437,8 @@
  illustrated in the introduction and it is sound. Now the problem is to decide
  whether an expression is well typed or not, that is, to find an algorithm that
  given a type environment $\Gamma$ and an expression $e$ decides whether there
  exists a type $t$ such that $\Gamma\vdash e:t$
- is probable. For that we need to solve essentially three problems: $(i)$ how to handle the fact that it is possible to deduce several
+ is provable.
+ For that we need to solve essentially two problems: $(i)$~how to handle the fact that it is possible to deduce several
  types for the same well-typed expression $(ii)$ how to compute the
  auxiliary deduction system for paths.
@@ -444,38 +449,38 @@
  infinitely many types, that is the annotation of the function intersected with
  as many negations of arrow types as it is possible without making the type
  empty. To handle this multiplicity we use and extend the technique of
  \emph{type schemes} defined
- by~\citet{Frisch2004}. Type schemes are canonical representations of the
- infinite set of types of $\lambda$-abstractions and they are presented in
- Section~\ref{sec:type-schemes}. The second origin is due
+ by~\citet{Frisch2008}. Type schemes---whose definition we recall in
+ Section~\ref{sec:type-schemes}---are canonical representations of the
+ infinite set of types of $\lambda$-abstractions. The second origin is due
  to the presence of structural rules\footnote{In logic, logical rules refer to a
  particular connective (here, a type constructor, that is, either $\to$, or
  $\times$, or $b$), while identity rules (e.g., axioms and cuts) and structural
  rules (e.g., weakening and
- contraction) do not.} such as \Rule{Subs} and \Rule{Intersect}. We
+ contraction) do not.} such as \Rule{Subs} and \Rule{Inter}. We
  handle it in the classic way: we define an algorithmic system that
- tracks the miminum type (actually, minimum type scheme) of an
- expression which is obtained from the original system by eliminating
- the two structural rules and distrubuting checks of the subtyping
+ tracks the miminum type (actually, minimum \emph{type scheme}) of an
+ expression; this system is obtained from the original system by eliminating
+ the two structural rules and by distributing suitable checks of the subtyping
  relation in the remaining rules. To do that in the presence of
- set-theoretic types we need to define some operators on types, given in section~\ref{sec:typeops}.
+ set-theoretic types we need to define some operators on types, which are given in Section~\ref{sec:typeops}.
  For what concerns the use of the auxiliary derivation for the $ $
- judgments, we present in Section~\ref{sec:typenv} an algorithm that we
- prove sound \beppe{and complete?}. All these notions are used in the
- algorithmic typing system given in Section~\ref{sec:algorules}.
+ judgments, we present in Section~\ref{sec:typenv} an algorithm for which we
+ prove the property of soundness and a limited form of completeness. All these
+ notions are then used in the algorithmic typing system given in
+ Section~\ref{sec:algorules}.
  %% \beppe{\begin{enumerate}
  %% \item type of functions -> type schemes
  %% \item elimination rules (app, proj, if) -> operations on types and how to compute them
- %% \item not syntax directed: rules Subs, Intersect, Env.
+ %% \item not syntax directed: rules Subs, Inter, Env.
  %% \item Compute the environments for occurrence typing. Algorithm to compute $\Gamma\vdash\Gamma$
  %% \end{enumerate}
  %% }
  \subsubsection{Type schemes}\label{sec:type-schemes}
- We introduce the new syntactic category of \emph{types schemes} which
- are the terms $\ts$ produced by the following grammar.
+ We introduce the new syntactic category of \emph{types schemes} which
+ are the terms $\ts$ inductively produced by the following grammar.
  \[
  \begin{array}{lrcl}
  \textbf{Type schemes} & \ts & ::= & t \alt \tsfun {\arrow t t ; \cdots ; \arrow t t} \alt \ts \tstimes \ts \alt \ts \tsor \ts \alt \tsempty
  \end{array}
  \]
@@ -504,7 +509,7 @@
  Note that $\tsint \ts$ is closed under subsumption and intersection
  empty set of types is different from $\Empty$ whose interpretation is
  the set of all types.
- \begin{lemma}
+ \begin{lemma}[\cite{Frisch2008}]
  Let $\ts$ be a type scheme and $t$ a type. It is possible to decide
  the assertion $t \in \tsint \ts$, which we also write $\ts \leq t$.
  \end{lemma}
@@ -520,7 +525,7 @@
  the value. By induction on the definition of values: $\tyof c {} = v{}\leq t$.
  We also need to perform intersections of type schemes so as to intersect the
  static type of an expression (\emph{ie}, the one deduced by conventional
  rules) with the one deduced by occurrence typing (\emph{ie}, the one derived
  by $\vdashp$). For our algorithmic system (see
  \Rule{Env$_{\scriptscriptstyle\mathcal{A}}$} in Section~\ref{sec:algorules})
  all we need to define is the intersection of a type scheme with a type:
- \begin{lemma}
+ \begin{lemma}[\cite{Frisch2008}]
  Let $\ts$ be a type scheme and $t$ a type. We can compute a type
  scheme, written $t \tsand \ts$, such that
  $$\tsint {t \tsand \ts} = \{s \alt \exists t' \in \tsint \ts.\ t \land t' \leq s \}$$
  \end{lemma}
@@ -547,10 +552,10 @@
  Note that $\tsrep \ts \in \tsint \ts$.
  {}
  \subsubsection{Operators for type constructors}\label{sec:typeops}
- {}\beppe{The explaination that follows is redundant in Section~\ref{sec:ideas}. Harmonize!}
+ {}\beppe{The explaination that follows is redundant with Section~\ref{sec:ideas}. Harmonize!}
  In order to define the algorithmic typing of expressions like
- applications and projections we need to define the operator on
+ applications and projections we need to define the operators on
  types we used in Section~\ref{sec:ideas}. Consider the rule \Rule{App}
  for applications. It essentially does three things: $(1)$ it checks that the
  function has functional type; $(2)$ it checks that the argument is in the
  domain of the
@@ -565,7 +570,7 @@
  union of intersection of arrows and negations of types. Checking that
  the function has a functional type is easy since it corresponds to
  checking that it has a type subtype of $\Empty\to\Any$.
  Determining its domain and the type of the application is more complicated.
- For instance if we have a function of type \code{\(t=(\Int \to \Int)\) \(\wedge\) \((\Bool \to \Bool)\)}, which
+ For instance, imagine we have a function of type \code{\(t=(\Int \to \Int)\) \(\wedge\) \((\Bool \to \Bool)\)}, which
  denotes functions that will return an integer if applied to an integer, and
  will return a Boolean if applied to a Boolean. It is possible to compute the
  domain of such a type
@@ -581,13 +586,13 @@
  an application of a function of type \(t_1\) to an argument of type \(t_2\). In
  the example with \code{\(t=(\Int \to \Int)\) \(\wedge\) \((\Bool \to \Bool)\)},
  it gives \code{\(t \circ \Int = \Int\)}, \code{\(t \circ \Bool = \Bool\)}, and
- \code{\(t \circ (\Int\vee\Bool) = \Int \vee \Bool\)}. In summary, given a
- functional type $t$ (\emph{ie}, a type $t$ such that $t\leq\Empty\to\Any$) our
- algorithms we use the following three operators
+ \code{\(t \circ (\Int\vee\Bool) = \Int \vee \Bool\)}. In summary, given a
+ functional type $t$ (\emph{ie}, a type $t$ such that $t\leq\Empty\to\Any$) our
+ algorithms will use the following three operators
  \begin{eqnarray}
  \dom t & = & \max \{ u \alt t\leq u\to \Any\} \\
  \apply t s & = &\min \{ u \alt t\leq s\to u\} \\
- \worra t s = \min\{u \alt t\circ(\dom t\setminus u)\leq \neg s\}\label{worra}
+ \worra t s & = &\min\{u \alt t\circ(\dom t\setminus u)\leq \neg s\}\label{worra}
  \end{eqnarray}
  The first two operators belongs to the theory of semantic subtyping while the
  last one is new and we described it in Section~\ref{sec:ideas}
@@ -597,7 +602,7 @@
  We need similar operators for projections since the type $t$ of $e$ in $\pi_i e$
  \begin{eqnarray}
  \bpl t & = & \min \{ u \alt t\leq \pair u\Any\}\\
  \bpr t & = & \min \{ u \alt t\leq \pair \Any u\}
  \end{eqnarray}
... ...
- All the operators above but $\worra{}{}$ are already present in the theory of
- semantic subtyping. The reader can find how to compute them and how to extend
- them to type schemes in~\cite[Section 6.11]{Frisch2008}.
- Below we just show the formula that computes $\worra t s$ for a $t$ subtype of
- $\Empty\to\Any$. For that, we use a result of semantic subtyping that states
- that every type $t$ is equivalent to a type in disjunctive normal form and that
- if furthermore $t\leq\Empty\to\Any$, then
- $t \simeq \bigvee_{i\in I}\left(\bigwedge_{p\in P_i}(s_p\to t_p)\bigwedge_{n\in N_i}\neg(s_n'\to t_n')\right)$
- with $\bigwedge_{p\in P_i}(s_p\to t_p)\bigwedge_{n\in N_i}\neg(s_n'\to t_n') \not\simeq \Empty$
- for all $i$ in $I$. For such a $t$ and any type $s$ then we have:
+ All the operators above but $\worra{}{}$ are already present in the theory of
+ semantic subtyping. The reader can find how to compute them and how to extend
+ their definition to type schemes in~\cite[Section 6.11]{Frisch2008}.
+ Below we just show the formula that computes $\worra t s$ for a $t$ subtype of
+ $\Empty\to\Any$. For that, we use a result of semantic subtyping that states
+ that every type $t$ is equivalent to a type in disjunctive normal form and that
+ if furthermore $t\leq\Empty\to\Any$, then
+ $t \simeq \bigvee_{i\in I}\left(\bigwedge_{p\in P_i}(s_p\to t_p)\bigwedge_{n\in N_i}\neg(s_n'\to t_n')\right)$
+ with $\bigwedge_{p\in P_i}(s_p\to t_p)\bigwedge_{n\in N_i}\neg(s_n'\to t_n') \not\simeq \Empty$
+ for all $i$ in $I$. For such a $t$ and any type $s$ then we have:
  %
  \worra t s = \dom t \wedge\bigvee_{i\in I}\left(\bigwedge_{\{P\subset P_i\alt s\leq \bigvee_{p \in P} \neg t_p\}}\left(\bigvee_{p \in P} \neg s_p\right) \right)
... ...
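To make the combinatorics of the $\worra{}{}$ formula above concrete, here is a small executable sketch of our own (not part of the commit). It works in a deliberately degenerate toy model: a "type" is just a set of base tags over a two-element universe, so subtyping is set inclusion and negation is complement; $t$ is a single intersection of arrows (so $|I|=1$ and $N_i=\emptyset$), and arrow and product types themselves are ignored. Under these assumptions the formula reproduces the expected result for the running example: $\worra t \Int = \Int$ when $t=(\Int\to\Int)\wedge(\Bool\to\Bool)$.

```python
from itertools import combinations

# Toy model (ours): a type is a frozenset of base tags from a finite
# universe; subtyping is inclusion, negation is complement w.r.t. U.
U = frozenset({"int", "bool"})
INT, BOOL = frozenset({"int"}), frozenset({"bool"})

def neg(t):
    return U - t

def union(sets):
    out = frozenset()
    for s in sets:
        out |= s
    return out

def worra(arrows, s):
    """worra(t, s) for t an intersection of arrows [(s_p, t_p)], following
    the paper's formula with a single summand i:
    dom t  /\  /\_{P ⊆ P_i : s ≤ \/_{p∈P} ¬t_p} ( \/_{p∈P} ¬s_p )."""
    dom = union(sp for sp, _ in arrows)   # dom t = \/_p s_p here
    acc = U                               # identity for intersection
    for k in range(len(arrows) + 1):
        for P in combinations(arrows, k):
            if s <= union(neg(tp) for _, tp in P):
                acc &= union(neg(sp) for sp, _ in P)
    return dom & acc

t = [(INT, INT), (BOOL, BOOL)]   # (Int -> Int) /\ (Bool -> Bool)
print(worra(t, INT) == INT)      # arguments that may produce an Int result
print(worra(t, BOOL) == BOOL)    # ... and those that may produce a Bool
```

Both checks print `True`; the only qualifying subsets for $s=\Int$ are $\{\Bool\to\Bool\}$ and the full set, whose intersection $\neg\Bool$ cut with $\dom t$ gives $\Int$, mirroring the hand computation one would do with the formula.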