Seeking intuition for LK sequent calculus ${\rightarrow}L$ inference rule

sequent-calculus

This Wikipedia article states the LK (${\rightarrow}L$) rule as:

$$\frac{\Gamma \vdash A, \Delta \qquad \Sigma, B \vdash \Pi}{\Gamma, \Sigma, A \rightarrow B \vdash \Delta, \Pi}$$

I'm trying to find some confirmation or intuition of why this is so.

First, I assume that there is a derivation of "$\Gamma \vdash A, \Delta$" and of "$\Sigma, B \vdash \Pi$" using the rules of the LK sequent calculus (I'm using the wording from this comment [1]).

I am interpreting "$\Gamma, \Sigma, A \rightarrow B \vdash \Delta, \Pi$" as saying: "In the joint context of $\Gamma$, $\Sigma$ and $A \rightarrow B$, at least one of $\Delta$ or $\Pi$ must be true." So my ultimate goal is to confirm that in the joint context of $\Gamma$, $\Sigma$ and $A \rightarrow B$ (or a subset of them), at least one of $\Delta$ or $\Pi$ is true under all possible assignments/cases of $A$ and $B$:

| $A$ | $B$ | $A \rightarrow B$ | Intuition |
| --- | --- | --- | --- |
| 0 | x | true | Given "$\Gamma \vdash A, \Delta$": in the context of $\Gamma$, if $A$ is false then $\Delta$ must be true, so $\Delta$ is true. |
| 1 | 0 | false | Since $A \rightarrow B$ is false, we can ignore this case: "$\Gamma, \Sigma, A \rightarrow B \vdash \Delta, \Pi$" makes no assertion about what happens when $A \rightarrow B$ is false. |
| 1 | 1 | true | Given "$\Sigma, B \vdash \Pi$": in the context of $\Sigma$ and $B$, $\Pi$ is true. |
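As a quick sanity check of this reading, here is a small truth-table script. It is only a sketch: it treats each of $\Gamma$, $\Sigma$, $\Delta$, $\Pi$ as a single propositional variable (a simplification of the multiset contexts) and reads a sequent as "the conjunction of the antecedents implies the disjunction of the succedents". It searches for a counterexample to the rule and finds none.

```python
from itertools import product

# Sketch: G, S, D, P stand in for Gamma, Sigma, Delta, Pi, each treated as a
# single propositional variable (a simplification -- in LK they are multisets).
#   Premise 1:   Gamma |- A, Delta            reads as  G -> (A or D)
#   Premise 2:   Sigma, B |- Pi               reads as  (S and B) -> P
#   Conclusion:  Gamma, Sigma, A -> B |- Delta, Pi
#                                             reads as  (G and S and (A -> B)) -> (D or P)

def implies(p, q):
    return (not p) or q

found = False
for G, S, A, B, D, P in product([False, True], repeat=6):
    prem1 = implies(G, A or D)
    prem2 = implies(S and B, P)
    concl = implies(G and S and implies(A, B), D or P)
    if prem1 and prem2 and not concl:
        print("counterexample:", G, S, A, B, D, P)
        found = True

if not found:
    print("no counterexample: whenever both premises hold, the conclusion holds")
```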

Does this make sense? If you have a cleaner or more concise way to get this intuition, I'm interested in seeing it.


[1] The sequent $\Gamma \vdash \Delta$ means that there is a formal derivation of this sequent using the rules of the LK sequent calculus.

I know the rule is part of the definition of $\vdash$, so by definition the rule holds [2], but I would still like to build some intuition for it.

Best Answer

I don't see any issue with your semantic interpretation of this $\to$-left rule, and you did a good job of case analysis on the material conditional. Just bear in mind that the contexts in sequent calculi are really multisets of propositions.


Another common way to interpret left rules is to regard them as the corresponding elimination rules of a natural deduction (ND) system. In this case, if we have $\Gamma, \Sigma, A \rightarrow B$ as our assumptions below the line, we can first weaken the antecedents of both sequents above the line: add $A \rightarrow B$ and $\Sigma$ to the left sequent, and $\Gamma$ and $A \rightarrow B$ to the right sequent. Then the left sequent above the line can replace $A$ in its succedent with $B$ after applying MP, and by the cut rule $B$ can then be eliminated from both sequents above the line. In this way the rule is really another reading of the implication elimination rule (MP) of an ND system (there is no cut rule in ND).


In notation and slightly more streamlined, start with the two sequents above the line:

$$\Gamma \vdash A, \Delta \qquad \Sigma, B \vdash \Pi$$

Weaken the assumption of the left sequent by adding $A \rightarrow B$. We now have:

$$\Gamma, A \rightarrow B \vdash A, \Delta \qquad \Sigma, B \vdash \Pi$$
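For reference, the weakening used here is LK's left weakening rule (often labelled $WL$). Stated schematically, with $G$, $D$ standing for arbitrary contexts and $C$ for an arbitrary formula, it reads:

$$\frac{G \vdash D}{G, C \vdash D}\;(WL)$$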

In the left sequent we have $A \rightarrow B$ as an assumption and $A$ in the succedent, so by MP (implication elimination) we can add $B$ to the succedent or, equally valid, replace the succedent $A$ with $B$. We now have:

$$\Gamma, A \rightarrow B \vdash B, \Delta \qquad \Sigma, B \vdash \Pi$$
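As an aside, if you prefer to see this step as a cut rather than an appeal to MP, here is one sketch. Accepting $A \rightarrow B, A \vdash B$ as the sequent form of modus ponens, a cut on $A$ against the original (unweakened) left premise arrives at the same sequent:

$$\frac{\Gamma \vdash A, \Delta \qquad A \rightarrow B, A \vdash B}{\Gamma, A \rightarrow B \vdash B, \Delta}\;\text{(cut on } A\text{)}$$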

We're now ready to apply the cut rule, which looks like this (copied from Wikipedia, with the metavariables renamed to avoid confusion):

$$\frac{G \vdash B, D \qquad S, B \vdash P}{G, S \vdash D, P}$$

We do some pattern matching ($G$ matches "$\Gamma, A \rightarrow B$", $S$ matches "$\Sigma$", $D$ matches "$\Delta$", $P$ matches "$\Pi$", and the cut formula is $B$), and apply the cut rule to get the sequent we were after all along:

$$\Gamma, \Sigma, A \rightarrow B \vdash \Delta, \Pi$$

(Reordering the antecedent is harmless, since the contexts are multisets.)
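Putting the pieces together, the whole argument can be summarized as one informal tree; the step labelled "MP" is the succedent replacement discussed above, not an official LK rule:

$$\dfrac{\dfrac{\dfrac{\Gamma \vdash A, \Delta}{\Gamma, A \rightarrow B \vdash A, \Delta}\;\text{(weakening)}}{\Gamma, A \rightarrow B \vdash B, \Delta}\;\text{(MP)} \qquad \Sigma, B \vdash \Pi}{\Gamma, \Sigma, A \rightarrow B \vdash \Delta, \Pi}\;\text{(cut on } B\text{)}$$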
