The previous analysis already covers a broad range of realistic cases. For instance, it handles list data structures, since products and recursive types can encode them as

...

...

extend our work to records. Although we use the record \emph{types} as they are defined in CDuce, we cannot do the same for CDuce record \emph{expressions}, which require non-trivial modifications for occurrence typing, especially because we want to capture the typing of record field extension and field deletion, two widely used record operations that current gradual typing systems fail to capture.

\else%%%%%%%%%%%

Since the main application of occurrence typing is to type dynamic languages, it is important to show that our work can handle records. We cannot handle them via a simple encoding into pairs, since we need a precise type analysis for operations such as field addition, update, and deletion. We therefore extend our approach to deal with records (types and expressions) as they are found in the aforementioned CDuce language, where these operations can be precisely typed.

\fi%%%%%

CDuce record \emph{types} can be embedded in our types by adding the following two type constructors:\\[1.4mm]
%
\centerline{\(\textbf{Types}\quad t ~ ::= ~ \record{\ell_1=t \ldots\ell_n=t}{t}\alt\Undef\)}\\[1.4mm]

...

...

field $\ell = t \vee \Undef$%
\ (note that ``$\ldots{}$'' is meta-syntax while ``{\large\textbf{..}}'' is syntax).
\fi

For what concerns \emph{expressions}, we cannot use CDuce record expressions as they are, but we must adapt them to our analysis. In particular, we consider records that are built starting from the empty record expression \erecord{} by adding, updating, or removing fields:\vspace{-1.5mm}

\[
\begin{array}{lrcl}
\textbf{Expr}& e & ::=&\erecord{}\alt\recupd e \ell e \alt\recdel e \ell\alt e.\ell
\end{array}\vspace{-1.5mm}
\]

where $\recdel e \ell$ deletes the field $\ell$ from $e$, $\recupd e \ell {e'}$ adds the field $\ell=e'$ to the record $e$ (deleting any existing $\ell$ field), and $e.\ell$ is field selection, with the reduction
\(\erecord{...,\ell=e,...}.\ell\ \reduces\ e\).
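The intended behavior of these three record operations can be illustrated with a small executable sketch, using Python dictionaries as a stand-in for records (the function names here are ours, chosen for illustration only, not CDuce syntax):

```python
# Toy model of the record operations: extension/update, deletion, selection.
# Python dicts stand in for records; names are ours, not CDuce's.

def ext(r, label, value):
    """Field extension: adds label=value, deleting any existing label field."""
    s = dict(r)
    s[label] = value
    return s

def rem(r, label):
    """Field deletion: removes the field `label` from the record, if present."""
    s = dict(r)
    s.pop(label, None)
    return s

def sel(r, label):
    """Field selection, mirroring the reduction {..., l=v, ...}.l ~> v."""
    return r[label]

r = ext(ext({}, "x", 3), "y", True)   # built starting from the empty record
assert sel(r, "x") == 3
assert rem(r, "x") == {"y": True}
assert ext(r, "x", 4)["x"] == 4       # extension deletes the previous x field
```

Note that extension doubles as update: adding a field first deletes any field with the same label, exactly as in the reduction semantics above.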

...

...

To define record type subtyping and record expression type inference we need three operators on record types.
%
Then two record types $t_1$ and $t_2$ are in the subtyping relation, $t_1\leq t_2$, if and only if for all $\ell\in\Labels$ we have $\proj\ell{t_1}\leq\proj\ell{t_2}$. In particular, $\orecord{\!\!}$ is the largest record type.
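This projection-based subtyping check can be sketched in a toy finite-value model of record types (our own encoding, not CDuce's): a record type maps each label to a set of allowed values, a default set covers all remaining labels, and a distinguished UNDEF value marks field absence.

```python
# Toy finite-value model of projection-based record subtyping.
# All names and encodings are ours, for illustration only.

UNDEF = "undef"                        # marker for an absent field
INT = frozenset({1, 2, 3})             # finite stand-in for the Int type
BOOL = frozenset({True, False})
ANY = INT | BOOL | {UNDEF}             # every value, or absence

def rtype(fields, default):
    """A record type: explicit fields plus a default for all other labels."""
    return (dict(fields), frozenset(default))

def proj(t, label):
    """Projection of a record type on a label."""
    fields, default = t
    return frozenset(fields.get(label, default))

def subtype(t1, t2, labels):
    """t1 <= t2 iff proj(t1, l) is a subset of proj(t2, l) for every label l."""
    return all(proj(t1, l) <= proj(t2, l) for l in labels)

closed = rtype({"x": INT}, {UNDEF})    # closed record: no other field allowed
open_ = rtype({"x": INT}, ANY)         # open record: any other field allowed
top = rtype({}, ANY)                   # the largest record type

labels = {"x", "y"}
assert subtype(closed, open_, labels)       # closed records subsume into open
assert subtype(open_, top, labels)          # everything is below the top record
assert not subtype(open_, closed, labels)   # extra fields break closed subtyping
```

The model shows why the open empty record type is the largest record type: it projects to the full set on every label, so every projection inclusion holds trivially.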

Expressions are then typed by the following rules (already in algorithmic form).\vspace{-3mm}

\begin{mathpar}
\Infer[Record]
{~}

...

...


where $ \Pf $ denotes the restriction of the powerset to finite subsets:


\end{align*}

where $\ConstantsInBasicType{()}$ is the function that assigns to each basic type the set of constants of that type, so that $c\in\ConstantsInBasicType(\basic{c})$.
We cannot take the equations above directly as an inductive definition of $\TypeInter{}$ because types are not defined inductively but coinductively.

...

...

Definition~\ref{def:types} ensures that the binary relation $\vartriangleright\,\subseteq\types\times\types$ defined by $t_1\lor t_2\vartriangleright t_i$, $t_1\land t_2\vartriangleright t_i$, $\neg t \vartriangleright t$ is Noetherian.
This gives an induction principle\footnote{In a nutshell, we can do proofs by induction on the structure of unions and negations---and, thus, intersections---but arrows, products, and basic types are the base cases for the induction.} on $\types$ that we use combined with structural induction on $\Domain$ to give the following definition (due to~\citet{types18post} and equivalent to but simpler than the definition in~\cite{Frisch2008}), which validates these equalities.

...

...

We define the \emph{set-theoretic interpretation}
$\TypeInter{} : \types\to\Pd(\Domain)$
as $\TypeInter{t}=\{d \in\Domain\mid(d : t)\}$.
\end{definition}

Notice that, since the binary predicate $(d : t)$ is defined on $\Domain\times\types$ and $\Omega\not\in\Domain$, we have $\Omega\not\in\TypeInter t$.
Finally,
we define the subtyping preorder and its associated equivalence relation