Commit cba64761 authored by Giuseppe Castagna

typos

parent 400077b7
@@ -19,7 +19,7 @@ rules%
 not.\svvspace{-3.3mm}}
 such as \Rule{Subs} and \Rule{Inter}. We handle this presence
 in the classic way: we define an algorithmic system that tracks the
-miminum type of an expression; this system is obtained from the
+minimum type of an expression; this system is obtained from the
 original system by removing the two structural rules and by
 distributing suitable checks of the subtyping relation in the
 remaining rules. To do that in the presence of set-theoretic types we
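As a note on the hunk above: the construction it refers to (drop \Rule{Subs} and \Rule{Inter}, inline the subtyping checks so that each expression is assigned its minimum type) is the standard one. A minimal sketch, written for simple arrow types only and with generic notation rather than the paper's actual rules, which must also handle set-theoretic types:

```latex
% Declarative presentation: subsumption is a separate structural rule.
\[
  \frac{\Gamma \vdash e : t \quad t \leq t'}{\Gamma \vdash e : t'}\;(\textsc{Subs})
  \qquad
  \frac{\Gamma \vdash e_1 : t_1 \to t_2 \quad \Gamma \vdash e_2 : t_1}
       {\Gamma \vdash e_1\, e_2 : t_2}
\]
% Algorithmic presentation: no subsumption; the subtyping check is performed
% exactly where it is needed and the rule returns the minimum result type.
\[
  \frac{\Gamma \vdash_{\mathcal{A}} e_1 : t_1 \to t_2 \quad
        \Gamma \vdash_{\mathcal{A}} e_2 : s \quad s \leq t_1}
       {\Gamma \vdash_{\mathcal{A}} e_1\, e_2 : t_2}
\]
```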
@@ -251,7 +251,7 @@ refinement is meaningful only when the function is typed by a union
 type, but also we had to build the expression that causes the
 divergence in quite an \emph{ad hoc} way which makes divergence even
 more unlikely: setting an $n_o$ twice the depth of the syntax tree of
-the outermost type case should be enough to capture all realistic
+the outermost type case should be more than enough to capture all realistic
 cases. For instance, all examples given in Section~\ref{sec:practical}
 can be checked (or found to be ill-typed) with $n_o = 1$.
 \fi
@@ -366,7 +366,7 @@ system is not complete as we discuss in details in the next section.
 \subsubsection{Properties of the algorithmic system}\label{sec:algoprop}
 \rev{%%%
 In what follow we will use $\Gamma\vdashA^{n_o} e:t$ to stress the
-fact that the judgement $\Gamma\vdashA e:t$ is provable in the
+fact that the judgment $\Gamma\vdashA e:t$ is provable in the
 algorithmic system where $\Refinef_{e,t}$ is defined as
 $(\RefineStep{e,t})^{n_o}$; we will omit the index $n_o$---thus keeping
 it implicit---whenever it does not matter in the context.
@@ -390,7 +390,7 @@ be associated to a set of types of the form $\{s\alt
 of $\vdashAts$.
-Completeness needs a more detailed explaination. The algorithmic
+Completeness needs a more detailed explanation. The algorithmic
 system $\vdashA$
 is not complete w.r.t.\ the language presented in
 Section~\ref{sec:syntax} because it cannot deduce negated arrow
@@ -91,6 +91,6 @@ account for polymorphism.
 This needs a whole gamut of non trivial research that we plan to
 develop in the near future building on the work on polymorphic types
 for semantic subtyping~\cite{CX11} and the research on the definition of
-polymophic languages with set-theoretic types
+polymorphic languages with set-theoretic types
 by~\citet{polyduce2,polyduce1,CPN16} and \citet{Pet19phd}.
 \fi
@@ -12,7 +12,7 @@ argument. The annotation specifies that the parameter is either a
 number or a string (the vertical bar denotes a union type). If this annotation is respected and the function
 is applied to either an integer or a string, then the application
 cannot fail because of a type error (\code{trim()} is a string method of the ECMAScript 5 standard that
-trims whitespaces from the beginning and end of the string) and both the type-checker of TypeScript and the one of Flow
+trims white-spaces from the beginning and end of the string) and both the type-checker of TypeScript and the one of Flow
 rightly accept this function. This is possible because both type-checkers
 implement a specific type discipline called \emph{occurrence typing} or \emph{flow
 typing}:\footnote{%
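For readers looking at this hunk in isolation: the function it describes is the paper's motivating TypeScript/Flow example, a one-parameter function annotated with the union type number | string whose body calls trim() only after a typeof test. The exact code is not part of this diff, so the following is an illustrative reconstruction rather than a quote:

```typescript
// Illustrative reconstruction (assumed shape, not quoted from the paper):
// the typeof test lets the checker narrow x to string before trim() is called.
function foo(x: number | string): string | number {
  if (typeof x === "string") {
    return x.trim(); // x : string in this branch
  }
  return x + 1;      // x : number in this branch
}
```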
@@ -61,7 +61,7 @@ by~\citet{Frisch2008}
 \iflongversion
 and detailed description of the algorithm to
 decide this relation can be found in~\cite{Cas15}. For the reader's
-convenience we succintly recall the definition of the subtyping
+convenience we succinctly recall the definition of the subtyping
 relation in the next subsection but it is possible to skip this
 subsection at first reading and jump directly to Subsection~\ref{sec:syntax}, since to understand
 the rest of the paper
@@ -269,8 +269,8 @@ and constructs a more precise \emph{dependent} type. In the dependent
 type, fresh variables are introduced to name sub-expressions and
 record the new constraints.
 This
-process---called \emph{selfification} in the cited works---roughly
-corresponds to a our $\constrf$ and \Refinef{} functions (see Section~\ref{sec:typenv}).
+process---called in the cited works \emph{selfification}---roughly
+corresponds to our $\constrf$ and \Refinef{} functions (see Section~\ref{sec:typenv}).
 Another approach is the one followed by \citet{RKJ08} which is
 completely based on a program transformation, namely, it consists in putting the term
 in \emph{administrative normal form} (\cite{SF92}). Using a program
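On the \citet{RKJ08} approach mentioned in this hunk: administrative normal form simply names every intermediate result, so that a type test on a compound expression ends up refining a plain variable. A small sketch of the idea, written in TypeScript for continuity with the paper's other examples (our illustration, not the calculus of the cited work):

```typescript
// Direct style: the tested sub-expression x.f is compound.
function direct(x: { f: number | string }) {
  return typeof x.f === "string" ? x.f.trim() : x.f;
}

// After A-normalization: the fresh binding t names the intermediate result,
// so the typeof test now refines the variable t itself.
function anf(x: { f: number | string }) {
  const t = x.f;
  return typeof t === "string" ? t.trim() : t;
}
```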
@@ -382,13 +382,13 @@ which is a pointer to the environment for the type hypothesis about
 the expression and, as such, it plays the role of our extended type
 environments. Likewise, the \emph{selfification} of \cite{OTMW04}
 and \cite{KF09}, propagates the precise type constraints learned
-during a test. On difference is that with refinement types, the
+during a test. One difference with our approach is that with refinement types the
 information can be
-kept at the level of types types, since dependent types contain terms and can
-introduce variables, while in our approach the mapping is kept in an
-environment. Although tracking of types for structured expressions thus
-seems thus an aspect commont to different approaches to occurrence types,
-we are confident that even this last non-standard aspect
+kept at the level of types, since dependent types contain terms and can
+introduce variables, while in our approach the mapping is kept separate in a
+type environment. In summary the tracking of types for structured expressions
+seems an aspect common to different approaches to occurrence types,
+nevertheless we are confident that even this last non-standard aspect
 of our system can be removed and that occurrence typing can be
 explained in a pure standard type-theoretic setting.